Entropy
Time Limit: 2000/1000 MS (Java/Others)    Memory Limit: 65536/32768 K (Java/Others)
Total Submission(s): 373    Accepted Submission(s): 208
Problem Description
An entropy encoder is a data encoding method that achieves lossless data compression by encoding a message with "wasted" or "extra" information removed. In other words, entropy encoding removes information that was not necessary in the first place to accurately encode the message. A high degree of entropy implies a message with a great deal of wasted information; English text encoded in ASCII is an example of a message type that has very high entropy. Already compressed messages, such as JPEG graphics or ZIP archives, have very little entropy and do not benefit from further attempts at entropy encoding.
English text encoded in ASCII has a high degree of entropy because all characters are encoded using the same number of bits, eight. It is a known fact that the letters E, L, N, R, S and T occur at a considerably higher frequency than do most other letters in English text. If a way could be found to encode just these letters with four bits, then the new encoding would be smaller, would contain all the original information, and would have less entropy. ASCII uses a fixed number of bits for a reason, however: it's easy, since one is always dealing with a fixed number of bits to represent each possible glyph or character. How would an encoding scheme that used four bits for the above letters be able to distinguish between the four-bit codes and eight-bit codes? This seemingly difficult problem is solved using what is known as a "prefix-free variable-length" encoding. In such an encoding, any number of bits can be used to represent any glyph, and glyphs not present in the message are simply not encoded. However, in order to be able to recover the information, no bit pattern that encodes a glyph is allowed to be the prefix of any other encoding bit pattern. This allows the encoded bitstream to be read bit by bit, and whenever a set of bits is encountered that represents a glyph, that glyph can be decoded. If the prefix-free constraint were not enforced, then such a decoding would be impossible.

Consider the text "AAAAABCD". Using ASCII, encoding this would require 64 bits. If, instead, we encode "A" with the bit pattern "00", "B" with "01", "C" with "10", and "D" with "11", then we can encode this text in only 16 bits; the resulting bit pattern would be "0000000000011011". This is still a fixed-length encoding, however; we're using two bits per glyph instead of eight. Since the glyph "A" occurs with greater frequency, could we do better by encoding it with fewer bits? In fact we can, but in order to maintain a prefix-free encoding, some of the other bit patterns will become longer than two bits. An optimal encoding is to encode "A" with "0", "B" with "10", "C" with "110", and "D" with "111". (This is clearly not the only optimal encoding, as it is obvious that the encodings for B, C and D could be interchanged freely for any given encoding without increasing the size of the final encoded message.) Using this encoding, the message encodes in only 13 bits to "0000010110111", a compression ratio of 4.9 to 1 (that is, each bit in the final encoded message represents as much information as did 4.9 bits in the original encoding). Read through this bit pattern from left to right and you'll see that the prefix-free encoding makes it simple to decode this into the original text even though the codes have varying bit lengths.

As a second example, consider the text "THE CAT IN THE HAT". In this text, the letter "T" and the space character both occur with the highest frequency, so they will clearly have the shortest encoding bit patterns in an optimal encoding. The letters "C", "I" and "N" only occur once, however, so they will have the longest codes. There are many possible sets of prefix-free variable-length bit patterns that would yield the optimal encoding, that is, that would allow the text to be encoded in the fewest number of bits. One such optimal encoding is to encode spaces with "00", "A" with "100", "C" with "1110", "E" with "1111", "H" with "110", "I" with "1010", "N" with "1011" and "T" with "01". The optimal encoding therefore requires only 51 bits compared to the 144 that would be necessary to encode the message with 8-bit ASCII encoding, a compression ratio of 2.8 to 1.
Input
The input file will contain a list of text strings, one per line. The text strings will consist only of uppercase alphanumeric characters and underscores (which are used in place of spaces). The end of the input will be signalled by a line containing only the word “END” as the text string. This line should not be processed.
Output
For each text string in the input, output the length in bits of the 8-bit ASCII encoding, the length in bits of an optimal prefix-free variable-length encoding, and the compression ratio accurate to one decimal point.
Sample Input
AAAAABCD
Sample Output
64 13 4.9
Source
Greater New York 2000
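The optimal length can be computed without actually building the code words: do a Huffman-style merge, repeatedly popping the two smallest symbol frequencies from a min-heap, adding their sum to a running total, and pushing the sum back until a single node remains. For "AAAAABCD" the frequencies are 5, 1, 1, 1; the merges cost 1+1=2, then 2+1=3, then 3+5=8, and the total 2+3+8=13 matches the 13-bit optimal encoding above. Below is a minimal sketch of just this computation (my own illustration, not part of the problem; the name optimalBits and its parameters are made up), followed by the full submitted solution.

#include <queue>
#include <vector>
#include <functional>

// Sketch: total bits of an optimal prefix-free encoding, given symbol counts.
// Assumes at least one nonzero count; totalChars is the message length.
long long optimalBits(const std::vector<int>& freq, int totalChars)
{
    std::priority_queue<long long, std::vector<long long>,
                        std::greater<long long> > pq;   // min-heap of counts
    for (int f : freq)
        if (f > 0) pq.push(f);
    if (pq.size() == 1)
        return totalChars;       // one distinct symbol still needs 1 bit per character
    long long bits = 0;
    while (pq.size() > 1) {
        long long a = pq.top(); pq.pop();
        long long b = pq.top(); pq.pop();
        bits += a + b;           // each merge lengthens both merged groups by one bit
        pq.push(a + b);
    }
    return bits;                 // e.g. freq {5,1,1,1}, 8 chars -> 13
}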
/*
Pitfall: I missed the case where the string contains only one kind of
character and foolishly puzzled over it for an hour.
*/
#include<bits/stdc++.h>
using namespace std;

// min-heap of symbol frequencies for the Huffman-style merge
priority_queue<int,vector<int>,greater<int> >q;
int vis[30];          // counts: index 0 for '_', 1..26 for 'A'..'Z'
char str[100005];     // input line buffer

int main()
{
    //freopen("C:\\Users\\acer\\Desktop\\in.txt","r",stdin);
    while(gets(str)&&strcmp(str,"END"))
    {
        memset(vis,0,sizeof vis);
        while(!q.empty()) q.pop();
        int m=strlen(str);
        //cout<<str<<endl;
        // count how often each symbol occurs
        for(int i=0;i<m;i++)
        {
            if(str[i]=='_')
                vis[0]++;
            else
                vis[str[i]-'A'+1]++;
        }
        // push every nonzero count into the min-heap
        for(int i=0;i<27;i++)
        {
            //cout<<vis[i]<<endl;
            if(vis[i])
                q.push(vis[i]);
        }
        // Huffman merge: the optimal length is the sum of all merge costs
        int cur=0;
        if(q.size()==1)
            cur=m;               // only one distinct symbol: 1 bit per character
        else
        {
            while(q.size()>1)
            {
                int one=q.top();q.pop();
                int two=q.top();q.pop();
                cur+=(one+two);
                //cout<<cur<<endl;
                q.push(one+two);
            }
        }
        printf("%d %d %.1lf\n",m*8,cur,m*8.0/cur);
    }
    return 0;
}
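The q.size()==1 branch is the edge case mentioned in the comment at the top of the code: with only one distinct symbol there is nothing to merge, yet every character still needs at least one bit, so the optimal length is simply the string length m. A quick check (the second input line is an extra case of my own, not from the official sample; its expected output follows from that branch):

Input:
AAAAABCD
AAAA
END

Expected output:
64 13 4.9
32 4 8.0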