Search Resource List
Baidu word segmentation lexicon
- Said to be a Chinese word segmentation dictionary that Baidu used in the past; hopefully it is of some help to everyone.
pymmseg-cpp-win32-1.0.1.tar.gz
- A Python-based Chinese word segmentation program with high ease of use; it can be called directly as an interface from Python programs.
Chinese-Segmentation.rar
- Self-written source code for Chinese word segmentation, implemented in VC++, with complete documentation and a standard segmentation database.
Chinese word segmentation algorithm
- This program learns from the supplied dictionary and segments the training corpus; written in C, efficient and easy to understand.
RMM
- A Chinese word segmentation system based on the RMM algorithm (reverse maximum matching), packaged as an MFC project; a minimal sketch of the algorithm follows below.
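For readers new to the technique, here is a minimal Python sketch of reverse maximum matching, not taken from this package; the dictionary contents are illustrative, and a real segmenter would load a large lexicon from disk.

```python
# Minimal reverse maximum matching (RMM) sketch.
# DICT is illustrative; real systems load a large lexicon from disk.
DICT = {"研究", "研究生", "生命", "的", "起源"}
MAX_LEN = max(len(w) for w in DICT)  # length of the longest dictionary entry

def rmm_segment(text):
    """Scan right to left, taking the longest dictionary word that ends
    at the current position; fall back to a single character."""
    words, end = [], len(text)
    while end > 0:
        for size in range(min(MAX_LEN, end), 0, -1):
            candidate = text[end - size:end]
            if size == 1 or candidate in DICT:
                words.append(candidate)
                end -= size
                break
    words.reverse()  # words were collected right to left
    return words

print(rmm_segment("研究生命的起源"))  # ['研究', '生命', '的', '起源']
```

Scanning from the right is what distinguishes RMM from forward maximum matching; for Chinese it tends to resolve overlap ambiguities such as 研究生/生命 more accurately.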
Source_Code
- A Chinese word segmentation analysis library that segments UTF-8 encoded Chinese text, with support for pluggable dictionaries and weighted manual intervention (via callbacks). Suitable as a basic segmentation component for search engine development.
splitword
- A small word segmentation program written by the author; a test version of a Chinese word segmenter, for reference only. Thanks!
Codes_and_Application
- A word segmentation tool from the Chinese Academy of Sciences, presumably for segmenting Chinese text; its efficiency is good.
VC2010
- The latest source code for calling the 2010 release of the Chinese Academy of Sciences segmentation component from VC. Building on years of research, the Institute of Computing Technology, Chinese Academy of Sciences developed the Chinese lexical analysis system ICTCLAS (Institute of Computing Technology, Chinese Lexical Analysis System). Its main features include Chinese word segmentation, part-of-speech tagging, named entity recognition, and new-word recognition, with support for user dictionaries. After five years of careful refinement and eight kernel upgrades, it has now reached ICTCLAS2010.
fenci
- Chinese word segmentation; it segments a file and assigns parts of speech, includes a lexicon, and supports adding new words.
splitword
- A Chinese word segmentation program based on VC++ 6.0, with an embedded dictionary.
1
- Chinese word segmentation is the most fundamental task in Chinese information processing: machine translation, information retrieval, and other related applications all depend on it whenever Chinese is involved, which gives it an extremely important position. The simplest entry point to Chinese word segmentation is the maximum matching method (see the sketch below).
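Since this entry names maximum matching as the simplest entry point, here is a minimal forward-matching counterpart to the RMM sketch above, again with an illustrative dictionary rather than anything from the package itself.

```python
# Minimal forward maximum matching (FMM) sketch; DICT is illustrative.
DICT = {"中文", "分词", "中文分词", "是", "基础", "任务"}
MAX_LEN = max(len(w) for w in DICT)

def fmm_segment(text):
    """Scan left to right, taking the longest dictionary word that starts
    at the current position; fall back to a single character."""
    words, start = [], 0
    while start < len(text):
        for size in range(min(MAX_LEN, len(text) - start), 0, -1):
            candidate = text[start:start + size]
            if size == 1 or candidate in DICT:
                words.append(candidate)
                start += size
                break
    return words

print(fmm_segment("中文分词是基础任务"))  # ['中文分词', '是', '基础', '任务']
```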
ansj_seg-master
- The ansj_seg-master Chinese word segmentation package, with installation instructions included (see the bundled README.md for details).
Desktop
- Implements Chinese word segmentation in MATLAB using the reverse maximum matching method, which segments text accurately.
coreseek
- A very handy Chinese word segmentation tool; it took a long search online to find one this good.
ChPreprocess
- Uses the jieba package to read data from an Excel sheet and perform Chinese word segmentation for corpus analysis (a workflow sketch follows below).
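As a rough sketch of the workflow this entry describes (not the package's actual code), using pandas and jieba; the file name and column name are hypothetical placeholders.

```python
# Sketch: read a text column from an Excel sheet and segment it with jieba.
# "data.xlsx" and the "text" column are hypothetical placeholders.
import jieba
import pandas as pd

df = pd.read_excel("data.xlsx")  # reading .xlsx requires openpyxl
df["tokens"] = df["text"].apply(lambda s: list(jieba.cut(str(s))))
print(df[["text", "tokens"]].head())
```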
chinese_seg_update
- Chinese word segmentation implemented with the reverse maximum matching method, using a dictionary as the index.
jieba-jieba3k
- A MATLAB toolkit for jieba word segmentation, used in many Chinese word segmentation and pattern recognition programs; it leverages the existing function toolkit to improve productivity. Installation instructions are included.
wordseg
- Uses the R language to perform Chinese word segmentation, compute word frequency statistics, and draw a word cloud for an intuitive visualization (a rough Python analog of the pipeline is sketched below).
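The package itself is written in R; purely as a rough Python analog of the same pipeline (segment, count word frequencies, draw a word cloud), assuming the jieba and wordcloud packages, with an illustrative input text and font path.

```python
# Rough Python analog of the R pipeline: segment, count, draw a word cloud.
# The input text and the font path are illustrative assumptions.
from collections import Counter
import jieba
from wordcloud import WordCloud

text = "中文分词是中文信息处理的基础,中文分词有很多方法"
freq = Counter(w for w in jieba.cut(text) if len(w) > 1)  # drop 1-char tokens
# A CJK-capable font is needed to render Chinese glyphs.
wc = WordCloud(font_path="simhei.ttf", width=600, height=400)
wc.generate_from_frequencies(freq)
wc.to_file("wordcloud.png")
print(freq.most_common(5))
```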
Python reverse maximum matching implementation (code, documentation, screenshots, etc.)
- Natural language processing of text: Chinese word segmentation. A homework assignment from the NLP course at the BUPT School of Computer Science.