2014 ACM/ICPC Asia Mudanjiang Regional Onsite Contest, Problem I (ZOJ 3827): Information Entropy
Information Entropy
Time Limit: 2 Seconds Memory Limit: 65536 KB Special Judge
Information Theory is one of the most popular courses in Marjar University. In this course, there is an important chapter about information entropy.
Entropy is the average amount of information contained in each message received. Here, a message stands for an event, or a sample or a character drawn from a distribution or a data stream.
Entropy thus characterizes our uncertainty about our source of information. The source is also characterized by the probability distribution of the samples drawn from it. The idea here is that the less likely an event is, the more information it provides when
it occurs.
Generally, "entropy" stands for "disorder" or uncertainty. The entropy we talk about here was introduced by Claude E. Shannon in his 1948 paper "A Mathematical Theory of Communication".
We also call it Shannon entropy or information entropy to distinguish from other occurrences of the term, which appears in various parts of physics in different forms.
Named after Boltzmann's H-theorem, Shannon defined the entropy Η (Greek capital letter eta) of a discrete random variable X with possible values {x1, x2, ..., xn} and probability mass function P(X) as:

H(X) = E[-log_b(P(X))]

Here E is the expected value operator. When taken from a finite sample, the entropy can explicitly be written as

H(X) = -Σ_{i=1..n} P(x_i) log_b(P(x_i))

Where b is the base of the logarithm used. Common values of b are 2, Euler's number e, and 10. The unit of entropy is bit for b = 2, nat for b = e, and dit (or digit) for b = 10, respectively.

In the case of P(xi) = 0 for some i, the value of the corresponding summand 0 log_b(0) is taken to be the well-known limit:

lim_{p→0+} p log_b(p) = 0
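As a quick sanity check (my own worked instance, matching the first sample case below), applying the formula with b = 2 to the distribution {0.25, 0.25, 0.5} gives:

H = -(0.25·log2(0.25) + 0.25·log2(0.25) + 0.5·log2(0.5))
  = -(0.25·(-2) + 0.25·(-2) + 0.5·(-1))
  = 1.5 bits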
Your task is to calculate the entropy of a finite sample with N values.
Input
There are multiple test cases. The first line of input contains an integer T indicating the number of test cases. For each test case:
The first line contains an integer N (1 <= N <= 100) and a string S. The string S is one of "bit", "nat" or "dit", indicating the unit of entropy.
In the next line, there are N non-negative integers P1, P2, ..., PN. Pi means the probability
of the i-th value in percentage, and the sum of Pi will be 100.
Output
For each test case, output the entropy in the corresponding unit.
Any solution with a relative or absolute error of at most 10^-8 will be accepted.
Sample Input
3
3 bit
25 25 50
7 nat
1 2 4 8 16 32 37
10 dit
10 10 10 10 10 10 10 10 10 10
Sample Output
1.500000000000
1.480810832465
1.000000000000
Author: ZHOU, Yuchen
Source: The 2014 ACM-ICPC Asia Mudanjiang Regional Contest
Problem link: http://acm.zju.edu.cn/onlinejudge/showProblem.do?problemId=5381 (Information Entropy)
Solution idea: another warm-up problem!!! I overthought it at the time and actually got hung up on integrals, which was definitely overthinking; no special technique is needed, just evaluate the math formula directly. I also picked up a small trick: a logarithm in an arbitrary base can be computed as log_m(n) = log(n) / log(m), and e can be written as exp(1.0).
Also note that when p = 0, the corresponding term is taken as 0.
AC code:
#include <stdio.h>
#include <string.h>
#include <iostream>
#include <algorithm>
#include <vector>
#include <queue>
#include <set>
#include <map>
#include <string>
#include <math.h>
#include <stdlib.h>
#include <time.h>
using namespace std;
#define INF 0x7fffffff
#define e exp(1.0)

int main()
{
#ifdef sxk
    freopen("in.txt", "r", stdin);
#endif
    int T, n, p;
    string s;
    scanf("%d", &T);
    while(T--)
    {
        double ans = 0;
        scanf("%d", &n);
        cin >> s;                                      // unit of entropy: "bit", "nat" or "dit"
        if(s == "bit"){                                // base-2 logarithm
            for(int i=0; i<n; i++){
                scanf("%d", &p);
                if(p)                                  // 0 * log(0) is taken as 0, so skip p = 0
                    ans += -(p/100.0*log(p/100.0)/log(2));
            }
        }
        else if(s == "nat"){                           // natural logarithm
            for(int i=0; i<n; i++){
                scanf("%d", &p);
                if(p)
                    ans += -(p/100.0*log(p/100.0)/log(e));
            }
        }
        else{                                          // "dit": base-10 logarithm
            for(int i=0; i<n; i++){
                scanf("%d", &p);
                if(p)
                    ans += -(p/100.0*log(p/100.0)/log(10));
            }
        }
        printf("%.14lf\n", ans);
    }
    return 0;
}
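As a follow-up to the change-of-base trick mentioned above, the three per-unit branches can also be collapsed into a single loop by picking the logarithm base first. The following compact version is only an illustrative sketch I wrote for comparison, not the submitted code:

#include <cstdio>
#include <cstring>
#include <cmath>

int main()
{
    int T;
    scanf("%d", &T);
    while (T--)
    {
        int n;
        char unit[8];
        scanf("%d %s", &n, unit);

        // Pick the logarithm base from the unit: bit -> 2, nat -> e, dit -> 10.
        double base = 2.0;
        if (strcmp(unit, "nat") == 0) base = exp(1.0);
        else if (strcmp(unit, "dit") == 0) base = 10.0;

        double ans = 0.0;
        for (int i = 0; i < n; i++)
        {
            int p;
            scanf("%d", &p);
            if (p)  // 0 * log(0) is taken as 0, so skip zero probabilities
                ans -= (p / 100.0) * log(p / 100.0) / log(base);
        }
        printf("%.12f\n", ans);
    }
    return 0;
}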