Search Results: Resource List
MFC-Look-it-up-in-the-dictionary
- A dictionary lookup, word segmentation, and word-frequency statistics program; very practical and recommended for download.
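The frequency-counting part of such a program can be sketched in a few lines with the standard library, assuming the text has already been segmented into tokens (the token list below is a made-up illustration):

```python
# Minimal word-frequency statistics over pre-segmented tokens.
from collections import Counter

tokens = ["我们", "是", "学生", "我们", "爱", "学习"]  # hypothetical segmented text
freq = Counter(tokens)
print(freq.most_common(2))  # [('我们', 2), ('是', 1)]
```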
jieba-0.31
- jieba word segmentation: open-source Chinese word segmentation in Python.
Sina-weibo
- Runs on C# + MySQL; integrates ICTCLAS word segmentation with the TF*PDF algorithm to perform trend analysis and hot-topic detection on collected Sina Weibo data. The regular expressions in the program can be adjusted to match data from the relevant code regions.
InformationGain
- An information-gain algorithm implemented in Java, bundled with training samples that have already been word-segmented.
maximum_entropy
- An implementation of the IIS parameter-estimation algorithm for maximum entropy models, designed for character-position tagging in Chinese word segmentation.
ikanalyzer
- Tests the IKAnalyzer tokenizer; demonstrates a method for reading txt files.
textmin
- Naive Bayes text classification code with word segmentation; written in Java, with explanatory comments.
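The core of naive Bayes text classification over segmented tokens can be sketched as follows; this is a minimal illustration with made-up training data, not the listed Java implementation:

```python
# Minimal multinomial naive Bayes over pre-segmented tokens,
# with Laplace (add-one) smoothing. Training data is hypothetical.
import math
from collections import Counter, defaultdict

def train_nb(docs):
    """docs: list of (tokens, label). Returns priors, counts, vocabulary."""
    label_docs = Counter()
    token_counts = defaultdict(Counter)
    vocab = set()
    for tokens, label in docs:
        label_docs[label] += 1
        token_counts[label].update(tokens)
        vocab.update(tokens)
    return label_docs, token_counts, vocab

def classify(tokens, label_docs, token_counts, vocab):
    """Pick the label maximizing log P(label) + sum of log P(token|label)."""
    total = sum(label_docs.values())
    best, best_score = None, float("-inf")
    for label, n_docs in label_docs.items():
        counts = token_counts[label]
        denom = sum(counts.values()) + len(vocab)  # smoothing denominator
        score = math.log(n_docs / total)
        score += sum(math.log((counts[t] + 1) / denom) for t in tokens)
        if score > best_score:
            best, best_score = label, score
    return best

train = [(["足球", "比赛"], "sports"), (["股票", "上涨"], "finance"),
         (["篮球", "比赛"], "sports")]
model = train_nb(train)
print(classify(["足球"], *model))  # sports
```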
pythonsample
- Natural language processing examples written in Python: word segmentation, new-word discovery, and Chinese text preprocessing.
AI
- Chinese word segmentation implementation code, a foundation of natural language analysis in artificial intelligence; suitable for AI beginners.
Participle
- An optimized word segmentation tool that segments text quickly.
SharpICTCLAS1.1
- A Chinese word segmentation system that extracts the words from a sentence; for example, "我们是学生" ("We are students") is segmented into "我们", "是", "学生".
omp_tokenize
- A multithreaded word segmentation system implemented with OpenMP; the input is a text and the output is the segmented result set.
MMSeg
- An automatic Chinese word segmentation system written in Java, with a GUI; implements forward maximum matching (FMM) and backward maximum matching (BMM).
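Forward maximum matching, one of the two strategies named above, can be sketched briefly; the toy dictionary and sentence below are illustrations only, and a real system would load a large lexicon:

```python
# Minimal forward maximum matching (FMM) segmentation sketch.
def fmm_segment(text, dictionary, max_len=4):
    """Greedily match the longest dictionary word from the left."""
    result = []
    i = 0
    while i < len(text):
        matched = False
        # Try the longest candidate first, shrinking toward one character.
        for j in range(min(max_len, len(text) - i), 0, -1):
            word = text[i:i + j]
            if word in dictionary:
                result.append(word)
                i += j
                matched = True
                break
        if not matched:
            # Fall back to a single character when no word matches.
            result.append(text[i])
            i += 1
    return result

words = {"我们", "是", "学生"}
print(fmm_segment("我们是学生", words))  # ['我们', '是', '学生']
```

BMM works the same way but scans from the right end of the sentence; comparing the two outputs is a common way to detect ambiguous segmentations.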
baiduTerm
- A glossary of terms from Baidu Baike, useful for natural language processing tasks such as Chinese word segmentation, named-entity recognition, and information extraction.
xapian
- A multimedia-platform search algorithm based on Xapian (C++); supports segmented-word search over hypertext documents.
chinese-segment
- An open-source Chinese word segmentation project written in Java.
BP-neural-network--based-on-Joone
- A BP (back-propagation) neural network algorithm written with Joone and applied to Chinese word segmentation. Using the sentence "这支歌太平淡无味了" as an example, the correct segmentation is 这支/歌/太/平淡/无味/了; the binary Unicode forms of the input and the expected output are used as training and test samples.
Chinese-Word-Segmentation
- A good Chinese word segmentation algorithm; see the comments after unzipping for details. The dictionary file must also be placed in the program directory.
cppjieba-master
- Chinese word segmentation implemented with statistical learning algorithms; high accuracy.
HLDLL-for-VB
- The Hailiang (海量) word segmentation library, VB version; worth studying, with high accuracy for Chinese word segmentation.