Resource search results
nlp
- A Chinese word segmentation program, written as a programming assignment for an NLP course.
chinese-analyzer
- The imdict-chinese-analyzer word segmentation program, a reimplementation of the Chinese Academy of Sciences ICTCLAS analyzer; bundles the Lucene segmentation jar package as a complete program.
hmmWordSegmentation
- An HMM-based sentence segmentation program written in Python; input sentences currently may not contain punctuation.
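The typical shape of such an HMM segmenter is character-level BMES tagging (Begin/Middle/End/Single) decoded with Viterbi. The sketch below is not taken from the program itself; the log-probability tables are toy values assumed purely for illustration, not trained parameters.

```python
STATES = "BMES"
NEG = -10.0  # log-probability floor for unseen events (assumed)

# Toy model: just enough mass to segment "中国人" into "中国" + "人".
start_p = {"B": -0.3, "M": NEG, "E": NEG, "S": -1.5}
trans_p = {
    "B": {"M": -1.5, "E": -0.3},
    "M": {"M": -1.0, "E": -0.5},
    "E": {"B": -0.7, "S": -0.7},
    "S": {"B": -0.7, "S": -0.7},
}
emit_p = {"B": {"中": -0.2}, "M": {}, "E": {"国": -0.2}, "S": {"人": -0.2}}

def viterbi(sentence, start_p, trans_p, emit_p):
    """Return the most likely BMES tag sequence for the sentence."""
    V = [{s: start_p[s] + emit_p[s].get(sentence[0], NEG) for s in STATES}]
    path = {s: [s] for s in STATES}
    for ch in sentence[1:]:
        V.append({})
        new_path = {}
        for s in STATES:
            # Best previous state for transitioning into s.
            best_prev, best_score = max(
                ((p, V[-2][p] + trans_p[p].get(s, NEG)) for p in STATES),
                key=lambda t: t[1],
            )
            V[-1][s] = best_score + emit_p[s].get(ch, NEG)
            new_path[s] = path[best_prev] + [s]
        path = new_path
    best = max(STATES, key=lambda s: V[-1][s])
    return path[best]

def tags_to_words(sentence, tags):
    """Cut the sentence after every E or S tag."""
    words, start = [], 0
    for i, t in enumerate(tags):
        if t in "ES":
            words.append(sentence[start:i + 1])
            start = i + 1
    return words
```

With the toy tables above, `tags_to_words("中国人", viterbi("中国人", start_p, trans_p, emit_p))` segments the string into `["中国", "人"]`. A real segmenter estimates the same tables from a tagged corpus such as the People's Daily data mentioned below.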
041206ChSeg
- The Peking University Tianwang (Skynet) word segmentation program; worth studying to understand how the segmentation process works.
Segmentation
- A simple Python segmentation program that splits a whole paragraph into words and inserts punctuation pauses between them.
Segmnet
- A word segmentation program for the C# environment based on ICTCLAS, provided for reference.
7675567
- VB source code for a high-frequency word segmentation program; download it if you need it. Good reference material.
split_words
- A word segmentation program using forward maximum matching, in Java. The core idea: starting from the leftmost character of the sentence, scan for the longest matching dictionary entry, then advance and repeat until the end of the sentence.
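The forward-maximum-matching idea described above can be sketched in a few lines. This is a generic illustration (in Python rather than the entry's Java), with the lexicon and maximum word length as assumed inputs, not code from the resource itself.

```python
def fmm_segment(sentence, lexicon, max_len=4):
    """Forward maximum matching: from the left edge, greedily take the
    longest dictionary word; an unmatched character becomes a
    single-character word."""
    words, i = [], 0
    while i < len(sentence):
        # Try the longest candidate first, shrinking toward length 1.
        for length in range(min(max_len, len(sentence) - i), 0, -1):
            candidate = sentence[i:i + length]
            if length == 1 or candidate in lexicon:
                words.append(candidate)
                i += length
                break
    return words

lexicon = {"研究", "研究生", "生命", "起源"}
print(fmm_segment("研究生命起源", lexicon))  # ['研究生', '命', '起源']
```

The example also shows the method's classic weakness: the greedy left-to-right choice picks 研究生 and mis-segments 生命, which is why several resources below pair forward matching with backward (reverse) maximum matching and compare the two results.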
ctbparser_0.12.tar
- A foundational natural language processing technology, finally released after painstaking research and development. Chinese word segmentation is one of the indispensable base technologies for Internet applications and an essential component of speech and language products. This upload is the ctbparser Chinese word segmentation program.
fenci
- A natural language understanding course experiment: a word segmentation program based on the Baidu lexicon.
ansj_seg-master
- A very full-featured word segmentation program with many internal test classes to try out; includes word frequency statistics.
ygrx
- A simple Chinese word segmentation program that extracts Chinese words by the association strength between characters, without requiring a corpus.
program
- A Chinese word segmentation program built on the 1998 People's Daily corpus; those doing similar research can download and use it.
automatic-word-segmentation
- An assignment: implement a Chinese automatic word segmentation program in any programming language. Optional extension: recognition of person, place, and organization names. Download the 1999 People's Daily segmentation corpus annotated by the Peking University Institute of Computational Linguistics, build a word list, and implement the forward and backward maximum matching algorithms.
Fenci
- Source code of a Chinese word segmentation program, including the lexicon and dictionary it uses.
SplitWords
- A Lucene-based document segmentation program: removes stop words, counts word frequencies, and computes word weights.
databayy
- An important corpus; a very useful data file for your word segmentation program.
matlab程序 (MATLAB program)
- The forward maximum matching algorithm for Chinese word segmentation, designed and implemented in MATLAB.
猜单词 (word-guessing game)
- A simple puzzle game: the player guesses one letter at a time; if the guessed letter appears in the word, all occurrences of that letter are treated as revealed. For example, if the word is apple and the player guesses p, the program displays the partially revealed word as -pp--. A time limit or a cap on wrong guesses can be set; exceeding the limit ends the game in failure.
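The masking behavior described in this entry (apple with p guessed shows as -pp--) is easy to sketch. This is a generic illustration, not code from the resource; the function names are invented here.

```python
def mask_word(secret, guessed):
    """Reveal every guessed letter; hide the rest with '-'."""
    return "".join(ch if ch in guessed else "-" for ch in secret)

def apply_guess(secret, guess, guessed, wrong_left):
    """Record one guess; a miss consumes one of the allowed wrong guesses."""
    guessed = guessed | {guess}
    if guess not in secret:
        wrong_left -= 1
    return guessed, wrong_left

print(mask_word("apple", {"p"}))  # -pp--
```

The game loop would repeat `apply_guess` until `mask_word` contains no `-` (win) or `wrong_left` reaches zero (loss).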
ansj_seg-master
- The ansj Chinese word segmentation program, based on Java; suitable for study by learners of semantic recognition.