Search resource list
小叮咚 word segmentation module
- The word segmentation module from the 小叮咚 project.
Baidu word segmentation dictionary
- Reportedly a Chinese word segmentation dictionary formerly used by Baidu; hopefully it is of some help.
Word segmentation module
- A very useful word segmentation module, of reference value to anyone researching search engines.
Chinese word segmentation library CipSegSDK V1.03
- Chinese word segmentation source code from 东大, mainly for Chinese text preprocessing in search engines.
Simple Chinese word segmentation in Delphi, v1.1
- A simple Chinese word segmenter implemented in Delphi.
庖丁 (Paoding) word segmentation tool
- A popular Java word segmentation program.
Information extraction and word segmentation for existing web pages
- This program performs information extraction and word segmentation on existing web pages, writing the results to a file named res.txt. It is preliminary work toward building a search engine.
Chinesewordsegmentationalgorithm
- A Chinese word segmentation algorithm; like PowerWord (金山词霸), it automatically segments the words of a sentence when the mouse hovers over it.
File_Search
- A Chinese and English word segmentation program; a small utility used in text retrieval.
Codes_and_Application
- The word segmentation tool from the Chinese Academy of Sciences, intended for Chinese text, with good efficiency.
css
- A Chinese word segmentation system written in Visual C++.
include
- The INCLUDE portion of a Chinese word segmentation system written in Visual C++.
utils
- The UTILS portion of a Chinese word segmentation system written in Visual C++.
KTDictSeg_V1.3.01
- KTDictSeg, developed by KaiToo Search, is a simple dictionary-based Chinese and English word segmentation algorithm.
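Dictionary-based segmenters like this one commonly use forward maximum matching: scan the text left to right and, at each position, take the longest dictionary entry that matches, falling back to a single character. A minimal sketch of that idea (the mini-dictionary and function name below are illustrative only, not KTDictSeg's actual API):

```python
# Forward maximum matching (FMM) over a toy dictionary.
DICT = {"中文", "分词", "算法", "搜索", "引擎"}  # illustrative mini-dictionary
MAX_LEN = max(len(w) for w in DICT)

def fmm_segment(text):
    tokens, i = [], 0
    while i < len(text):
        # Try the longest candidate first, shrinking until a match is found;
        # a single character always matches as a last resort.
        for size in range(min(MAX_LEN, len(text) - i), 0, -1):
            word = text[i:i + size]
            if size == 1 or word in DICT:
                tokens.append(word)
                i += size
                break
    return tokens

print(fmm_segment("中文分词算法"))  # → ['中文', '分词', '算法']
```

Real systems refine this with backward matching, ambiguity resolution, and statistical models, but the greedy longest-match loop is the core of most dictionary-based segmenters.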
NLuke0.12
- A web-based search and word segmentation tool extending Lucene.
SentenceSplit
- A Chinese word segmentation component for small search engines, with a built-in dictionary.
src
- A simple search engine built with Lucene, capable of Chinese word segmentation.
SearchEngine
- A custom web search engine developed with C# and Lucene.Net. The project implements word segmentation and fuzzy indexing, which together with Lucene.Net's core functionality form the search mechanism.
UseHLSSplit(Fix)
- Chinese word segmentation processing: a Delphi binding for the 海量 intelligent word segmentation library, fixing errors found in another version circulating online.
Complete site search engine (Lucene.Net + PanGu segmentation)
- A demo of PanGu (盘古) word segmentation plus full-site search, for reference only.