Search Resource List
KTDictSeg_V1.3.01
- KTDictSeg is a simple dictionary-based Chinese and English word segmentation algorithm developed by KaiToo Search.
zhishifencisuanfa
- A full-text retrieval system for medical records based on a knowledge-based word segmentation algorithm; a master's thesis.
p
- A word segmentation component for .NET 2.0, to be released soon.
ir
- This system implements word segmentation and an inverted index; segmentation uses forward maximum matching.
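Several entries in this list rely on forward maximum matching (FMM). As a rough illustration of the technique (a sketch, not code from any listed project; the sample dictionary and word-length cap are made up):

```python
def fmm_segment(text, dictionary, max_len=4):
    """Forward maximum matching: at each position, greedily take the
    longest substring (up to max_len) found in the dictionary."""
    words = []
    i = 0
    while i < len(text):
        # Try the longest candidate first, shrinking until a hit;
        # fall back to a single character when nothing matches.
        for j in range(min(max_len, len(text) - i), 0, -1):
            candidate = text[i:i + j]
            if j == 1 or candidate in dictionary:
                words.append(candidate)
                i += j
                break
    return words
```

The classic weakness of FMM shows up on inputs like 研究生命起源, where the greedy match 研究生 ("graduate student") swallows the first character of 生命 ("life"); this is why several projects below pair it with reverse matching.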
MmFenCi
- A word segmentation algorithm based on maximum matching (MM); those interested can finish the incomplete parts of the program.
WordSeg
- A simple word segmentation program: reads a PDF and outputs a segmented .txt file.
NLuke0.12
- A web-based search and segmentation tool that extends Lucene.
2004050215271615762
- Chinese word segmentation program source code, for learning and research use only.
participle_1.0
- Provides rich word segmentation features; a useful reference for users learning Chinese word segmentation.
SentenceSplit
- Chinese word segmentation component for small search engines, with a built-in dictionary.
ParseWord
- Segments text using a dictionary with forward and reverse maximum matching, and includes a vocabulary list. VS 2005, C#.
wordsegment1
- A highly efficient Chinese word segmentation algorithm; uses a dictionary trie search to cut words and provides functions for extending the dictionary.
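The dictionary-trie search this entry mentions can be sketched as follows; the class and method names are illustrative, not taken from the project. A trie lets a segmenter find the longest dictionary word at a position in a single left-to-right pass, instead of probing the dictionary once per candidate length:

```python
class TrieNode:
    def __init__(self):
        self.children = {}   # char -> TrieNode
        self.is_word = False

class Trie:
    """Minimal dictionary trie supporting longest-prefix match."""

    def __init__(self, words=()):
        self.root = TrieNode()
        for w in words:
            self.insert(w)

    def insert(self, word):
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.is_word = True

    def longest_match(self, text, start=0):
        """Return the longest dictionary word beginning at `start`, or None."""
        node, best = self.root, None
        for i in range(start, len(text)):
            node = node.children.get(text[i])
            if node is None:
                break
            if node.is_word:
                best = text[start:i + 1]
        return best
```

Dictionary extension then amounts to calling `insert` with new words at runtime, which is presumably what the project's "extend the dictionary" functions do.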
ChineseTokenizer
- Chinese word segmentation demo program that segments strings by word sense; widely applied in search technology.
WordSeg--JAVA
- This program implements Chinese word segmentation using forward and reverse maximum matching.
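Reverse maximum matching (RMM), the counterpart to forward matching used by this and other entries, scans from the end of the string instead. A minimal sketch under the same illustrative assumptions as before (sample dictionary and length cap are made up):

```python
def rmm_segment(text, dictionary, max_len=4):
    """Reverse maximum matching: from the end of the text, greedily
    take the longest suffix (up to max_len) found in the dictionary."""
    words = []
    i = len(text)
    while i > 0:
        for j in range(min(max_len, i), 0, -1):
            candidate = text[i - j:i]
            if j == 1 or candidate in dictionary:
                words.insert(0, candidate)  # prepend to keep original order
                i -= j
                break
    return words
```

On 研究生命起源, RMM recovers 研究 / 生命 / 起源 where forward matching fails, which is why bidirectional systems run both and pick the segmentation with fewer (or less ambiguous) words.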
sharpictclas
- C# version of the Chinese Academy of Sciences (ICTCLAS) word segmentation system; uses a hidden Markov model and achieves very high recognition accuracy.
ictclaszyfc-v2009
- Chinese Academy of Sciences (ICTCLAS) word segmentation system, including vocabulary addition, word-frequency statistics, and more.
HMM
- Statistics-based word segmentation using a hidden Markov model, with an experimental report included.
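A common way HMM segmenters (like this entry and the ICTCLAS ports above) work is to tag each character as B (word begin), M (middle), E (end), or S (single-character word), then decode the best tag sequence with Viterbi. The sketch below assumes that BMES scheme; every probability here is a toy value for illustration, not taken from any listed project, and the emission model is passed in as a function:

```python
import math

STATES = "BMES"
# Toy start and transition probabilities (illustrative only).
START = {"B": 0.6, "M": 0.0, "E": 0.0, "S": 0.4}
TRANS = {
    "B": {"M": 0.3, "E": 0.7},
    "M": {"M": 0.3, "E": 0.7},
    "E": {"B": 0.6, "S": 0.4},
    "S": {"B": 0.6, "S": 0.4},
}

def viterbi(obs, emit):
    """Return the most likely BMES tag sequence for the characters in
    `obs`, given an emission function emit(char, state) -> probability."""
    NEG = float("-inf")
    def lp(p):
        return math.log(p) if p > 0 else NEG
    # Initialise with start probabilities.
    V = [{s: lp(START[s]) + lp(emit(obs[0], s)) for s in STATES}]
    path = {s: [s] for s in STATES}
    for ch in obs[1:]:
        V.append({})
        new_path = {}
        for s in STATES:
            # Best previous state under the transition model.
            best_prev, best = max(
                ((p, V[-2][p] + lp(TRANS[p].get(s, 0))) for p in STATES),
                key=lambda x: x[1],
            )
            V[-1][s] = best + lp(emit(ch, s))
            new_path[s] = path[best_prev] + [s]
        path = new_path
    return path[max(V[-1], key=V[-1].get)]

def tags_to_words(text, tags):
    """Cut `text` into words at every E or S tag."""
    words, start = [], 0
    for i, t in enumerate(tags):
        if t in "ES":
            words.append(text[start:i + 1])
            start = i + 1
    return words
```

In a real system the emission probabilities P(char | tag) are estimated from a tagged corpus; with even a uniform emission model, the transition structure alone already favours multi-character words.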
maxseg
- Maximum matching word segmentation system, one of the most common approaches to segmentation; it also performs well.
MaximumMatching
- Segments text with the maximum matching method; only a dictionary is needed. Good performance.
keyword-chouqu
- Word segmentation based on the reverse maximum matching algorithm, plus an HMM-based part-of-speech tagging system, including unknown-word recognition and database additions. (The database path must be edited by hand before it can run.)