From:  johndcook.com/blog

For a set of positive probabilities $p_i$ summing to 1, their entropy is defined as

$$H = -\sum_i p_i \log p_i.$$

(For this post, log will mean log base 2, not natural log.)

This post looks at a couple of questions about computing entropy. First, are there any numerical problems computing entropy directly from the equation above?

Second, imagine you don’t have the $p_i$ values directly but rather counts $n_i$ that sum to $N$. Then $p_i = n_i/N$. To apply the equation directly, you’d first need to compute $N$, then go back through the data again to compute entropy. If you have a large amount of data, could you compute the entropy in one pass?

To address the second question, note that

$$-\sum_i \frac{n_i}{N} \log \frac{n_i}{N} = \log N - \frac{1}{N} \sum_i n_i \log n_i,$$

so you can sum $n_i$ and $n_i \log n_i$ in the same pass.
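
Here is a minimal sketch of the one-pass calculation in Python. The function name and the use of the standard library’s `math.log2` are my choices, not from the original post; it assumes the counts arrive as a single iterable of positive numbers, such as a stream.

```python
import math

def entropy_one_pass(counts):
    """Entropy in bits from raw counts, accumulated in a single pass.

    Uses the identity H = log2(N) - (1/N) * sum(n_i * log2(n_i)),
    so the total N need not be known in advance.
    """
    total = 0.0
    weighted_log_sum = 0.0
    for n in counts:
        total += n
        weighted_log_sum += n * math.log2(n)
    return math.log2(total) - weighted_log_sum / total
```

For example, `entropy_one_pass([1, 1, 2])` returns 1.5, the entropy in bits of the distribution (1/4, 1/4, 1/2).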

One of the things you learn in numerical analysis is to look carefully at subtractions. Subtracting two nearly equal numbers can result in a loss of precision. Could the two terms above be nearly equal? Maybe, if the $n_i$ are ridiculously large. Not just astronomically large: numbers like the number of particles in the universe, roughly $10^{80}$, are fine. Ridiculously large means numbers whose logarithms approach the limits of machine-representable numbers. (If we’re only talking about numbers as big as the number of particles in the universe, their logs will be at most three-digit numbers, since $\log_2 10^{80} \approx 266$.)

Now to the problem of computing the sum of $n_i \log n_i$. Could the order of the terms matter? The same concern applies to the first question of the post if we sum the $p_i \log p_i$ terms directly. In general, you’ll get better accuracy summing a lot of positive numbers by sorting them and adding from smallest to largest, and worse accuracy by summing largest to smallest. If adding a sorted list gives essentially the same result in either direction, summing the list in any other order should give essentially the same result too.
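
A quick way to see the effect of summation order is to compare both directions against Python’s `math.fsum`, which computes a correctly rounded sum. This is a sketch with synthetic data; the lognormal spread is my choice, not from the post.

```python
import math
import random

# Positive numbers spanning many orders of magnitude.
random.seed(0)
xs = sorted(random.lognormvariate(0, 10) for _ in range(10**6))

asc = sum(xs)             # smallest to largest
desc = sum(reversed(xs))  # largest to smallest
ref = math.fsum(xs)       # correctly rounded reference sum

print(f"ascending  relative error: {abs(asc - ref) / ref:.2e}")
print(f"descending relative error: {abs(desc - ref) / ref:.2e}")
```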

To test the methods discussed here, I used two sets of count data, one on the order of a million counts and the other on the order of a billion counts. Both data sets had approximately a power law distribution with counts varying over seven or eight orders of magnitude. For each data set I computed the entropy four ways: two equations times two orders. That is, I computed entropy both from the probabilities $p_i$ and directly from the counts $n_i$, and in each case I summed both smallest to largest and largest to smallest.
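
To make the comparison concrete, here is a sketch of the four-way experiment. The original data sets are not available, so it uses synthetic power-law counts as a stand-in; the function names and the particular power-law shape are my own.

```python
import math

def entropy_from_probs(ps):
    """H = -sum of p_i * log2(p_i), summed in the given order."""
    return -sum(p * math.log2(p) for p in ps)

def entropy_from_counts(ns):
    """H = log2(N) - (1/N) * sum of n_i * log2(n_i), summed in the given order."""
    N = sum(ns)
    return math.log2(N) - sum(n * math.log2(n) for n in ns) / N

# Hypothetical stand-in data: power-law counts spanning several
# orders of magnitude, sorted ascending.
counts = sorted(max(1, int(1e8 / k**1.1)) for k in range(1, 500_000))
N = sum(counts)
probs = [n / N for n in counts]

for label, seq, f in [
    ("probs,  ascending ", probs, entropy_from_probs),
    ("probs,  descending", probs[::-1], entropy_from_probs),
    ("counts, ascending ", counts, entropy_from_counts),
    ("counts, descending", counts[::-1], entropy_from_counts),
]:
    print(label, f"{f(seq):.9f}")
```

On data like this, the four printed values should agree to many significant figures, since summing positive terms in double precision is well conditioned.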

For the smaller data set, all four methods produced the same answer to nine significant figures. For the larger data set, all four methods produced the same answer to seven significant figures. So at least for the kind of data I’m looking at, it doesn’t matter how you calculate entropy, and you might as well use the one-pass algorithm to get the result faster.
