File name: entropy
Category:
Tags:
Upload date: 2012-11-16
File size: 1.76 KB
Downloads: 0
Provider:
Related links: none
Description:
The Shannon entropy is a measure of the average information content one is missing when one does not know the value of the random variable. The concept was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".
Shannon's entropy represents an absolute limit on the best possible lossless compression of any communication, under certain constraints: treating the messages to be encoded as a sequence of independent and identically distributed random variables, Shannon's source coding theorem shows that, in the limit, the average length of the shortest possible representation of the messages in a given alphabet is their entropy divided by the logarithm of the number of symbols in the target alphabet.
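As a brief formal restatement (added here for reference, not part of the original package description): for a discrete random variable X with probability mass function p(x), the Shannon entropy and the source-coding bound on the expected per-symbol code length L over a D-symbol target alphabet read, in LaTeX notation,

H(X) = -\sum_{x} p(x)\,\log_2 p(x), \qquad L \;\ge\; \frac{H(X)}{\log_2 D}.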
Download file list
entropy/escorTsallis_entro.m
entropy/K_q_escorTsallis.m
entropy/K_q_renyi.m
entropy/K_q_Tsallis.m
entropy/read me.txt
entropy/renyi_entro.m
entropy/shannon_entro.m
entropy/Tsallis_entro.m
entropy
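For illustration only, here is a minimal Python sketch of what the file names above suggest: routines that compute the Shannon, Rényi, and Tsallis entropies of a discrete probability vector. It is an assumption-based sketch, not the package's actual MATLAB code, and the function names are my own.

import numpy as np

def shannon_entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # ignore zero-probability outcomes
    return -np.sum(p * np.log2(p))

def renyi_entropy(p, q):
    """Renyi entropy of order q (q > 0, q != 1), in bits."""
    p = np.asarray(p, dtype=float)
    return np.log2(np.sum(p ** q)) / (1.0 - q)

def tsallis_entropy(p, q):
    """Tsallis entropy of order q (q != 1)."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

if __name__ == "__main__":
    p = [0.5, 0.25, 0.25]
    print(shannon_entropy(p))         # 1.5 bits
    print(renyi_entropy(p, 2))        # ~1.415 bits
    print(tsallis_entropy(p, 2))      # 0.625

All three quantities reduce to (or recover) the Shannon entropy as the order parameter q approaches 1, which is presumably why the package groups them together.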
