paper 22: KL-divergence implementation code
The core of the implementation is the following MATLAB function:
function KL = kldiv(varValue,pVect1,pVect2,varargin)
%KLDIV Kullback-Leibler or Jensen-Shannon divergence between two distributions.
% KLDIV(X,P1,P2) returns the Kullback-Leibler divergence between two
% distributions specified over the M variable values in vector X. P1 is a
% length-M vector of probabilities representing distribution 1, and P2 is a
% length-M vector of probabilities representing distribution 2. Thus, the
% probability of value X(i) is P1(i) for distribution 1 and P2(i) for
% distribution 2. The Kullback-Leibler divergence is given by:
%
% KL(P1(x),P2(x)) = sum[P1(x).*log2(P1(x)./P2(x))]   (base-2 logs; result in bits)
%
% If X contains duplicate values, there will be a warning message, and these
% values will be treated as distinct values. (I.e., the actual values do
% not enter into the computation, but the probabilities for the two
% duplicate values will be considered as probabilities corresponding to
% two unique values.) The elements of probability vectors P1 and P2 must
% each sum to 1 +/- .00001.
%
% A "log of zero" warning will be thrown for zero-valued probabilities.
% Handle this however you wish. Adding 'eps' or some other small value
% to all probabilities seems reasonable. (Renormalize if necessary.)
%
% KLDIV(X,P1,P2,'sym') returns a symmetric variant of the Kullback-Leibler
% divergence, given by [KL(P1,P2)+KL(P2,P1)]/2. See Johnson and Sinanovic
% (2001).
%
% KLDIV(X,P1,P2,'js') returns the Jensen-Shannon divergence, given by
% [KL(P1,Q)+KL(P2,Q)]/2, where Q = (P1+P2)/2. See the Wikipedia article
% for "Kullback朙eibler divergence". This is equal to 1/2 the so-called
% "Jeffrey divergence." See Rubner et al. (2000).
%
% EXAMPLE: Let the event set and probability sets be as follows:
% X = [1 2 3 3 4]';
% P1 = ones(5,1)/5;
% P2 = [0 0 .5 .2 .3]' + eps;
%
% Note that the event set here has duplicate values (two 3's). These
% will be treated as DISTINCT events by KLDIV. If you want these to
% be treated as the SAME event, you will need to collapse their
% probabilities together before running KLDIV. One way to do this
% is to use UNIQUE to find the set of unique events, and then
% iterate over that set, summing probabilities for each instance of
% each unique event. Here, we just leave the duplicate values to be
% treated independently (the default):
% KL = kldiv(X,P1,P2);
% KL =
% 19.4899
%
% Note also that we avoided the log-of-zero warning by adding 'eps'
% to all probability values in P2. We didn't need to renormalize
% because we're still within the sum-to-one tolerance.
%
% REFERENCES:
% 1) Cover, T.M. and J.A. Thomas. "Elements of Information Theory," Wiley,
% 1991.
% 2) Johnson, D.H. and S. Sinanovic. "Symmetrizing the Kullback-Leibler
% distance." IEEE Transactions on Information Theory (Submitted).
% 3) Rubner, Y., Tomasi, C., and Guibas, L. J., 2000. "The Earth Mover's
% distance as a metric for image retrieval." International Journal of
% Computer Vision, 40(2): 99-121.
% 4) Kullback-Leibler divergence. Wikipedia, The Free Encyclopedia.
%
% See also: MUTUALINFO, ENTROPY
if ~isequal(unique(varValue),sort(varValue)),
    warning('KLDIV:duplicates','X contains duplicate values. Treated as distinct values.')
end
if ~isequal(size(varValue),size(pVect1)) || ~isequal(size(varValue),size(pVect2)),
    error('All inputs must have same dimension.')
end
% Check that the probabilities sum to 1 (within tolerance):
if (abs(sum(pVect1) - 1) > .00001) || (abs(sum(pVect2) - 1) > .00001),
    error('Probabilities don''t sum to 1.')
end
if ~isempty(varargin),
    switch varargin{1},
        case 'js',
            % Jensen-Shannon divergence: [KL(P1,Q)+KL(P2,Q)]/2 with Q = (P1+P2)/2
            logQvect = log2((pVect2+pVect1)/2);
            KL = .5 * (sum(pVect1.*(log2(pVect1)-logQvect)) + ...
                       sum(pVect2.*(log2(pVect2)-logQvect)));
        case 'sym',
            % Symmetric variant: [KL(P1,P2)+KL(P2,P1)]/2
            KL1 = sum(pVect1 .* (log2(pVect1)-log2(pVect2)));
            KL2 = sum(pVect2 .* (log2(pVect2)-log2(pVect1)));
            KL = (KL1+KL2)/2;
        otherwise
            error(['Last argument' ' "' varargin{1} '" ' 'not recognized.'])
    end
else
    % Plain KL divergence KL(P1,P2), in bits (base-2 logarithm)
    KL = sum(pVect1 .* (log2(pVect1)-log2(pVect2)));
end
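A minimal usage sketch is shown below. It reruns the example from the help text in all three modes, and also shows one way to collapse duplicate events with UNIQUE and ACCUMARRAY, as the help text suggests. The variable names here are only illustrative, not part of the original function.

% Usage sketch (assumes kldiv.m is on the MATLAB path).
X  = [1 2 3 3 4]';             % event values; the two 3's trigger a duplicate warning
P1 = ones(5,1)/5;              % uniform distribution over the 5 events
P2 = [0 0 .5 .2 .3]' + eps;    % eps added to avoid the log-of-zero warning

kl  = kldiv(X,P1,P2)           % plain KL(P1||P2), about 19.4899 bits
kls = kldiv(X,P1,P2,'sym')     % symmetrized: [KL(P1,P2)+KL(P2,P1)]/2
klj = kldiv(X,P1,P2,'js')      % Jensen-Shannon divergence (at most 1 bit with log2)

% If the duplicate 3's should count as ONE event, collapse their probabilities
% first, e.g. with UNIQUE and ACCUMARRAY, then renormalize if needed:
[Xu,~,idx] = unique(X);        % Xu = [1 2 3 4]'
P1u = accumarray(idx,P1);      % sums the probabilities of duplicate events
P2u = accumarray(idx,P2);
kl_u = kldiv(Xu,P1u,P2u)       % KL over the collapsed event set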