It is remarkable that data can be compressed at all, because it implies that the information we routinely exchange can often be conveyed in fewer units than we actually use. Instead of saying 'yes', a simple nod does the same work while transmitting less data. In his 1948 paper 'A Mathematical Theory of Communication', Shannon established that there is a fundamental limit to lossless data compression. This limit, called the entropy rate and denoted by H, depends on the information source, more specifically on the statistical nature of the source. It is possible to compress the source losslessly at a rate arbitrarily close to H, but it is mathematically impossible to do better than H.
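As a concrete illustration (a minimal sketch, not taken from the original report), for a memoryless binary source with P(0) = p the entropy rate is H = -(p log2 p + (1-p) log2 (1-p)) bits per symbol. The Python snippet below assumes a hypothetical biased-coin source with p = 0.9, computes H, and compares it with the rate actually achieved by a general-purpose lossless compressor (zlib) on a long sample drawn from that source.

```python
import math
import random
import zlib

def entropy(probs):
    # Shannon entropy in bits per symbol for a memoryless source
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Assumed source: a biased coin with P(0) = 0.9, P(1) = 0.1
p = 0.9
H = entropy([p, 1 - p])

# Draw a long sample and compress it losslessly with zlib (DEFLATE)
n = 1_000_000
random.seed(0)
data = bytes(0 if random.random() < p else 1 for _ in range(n))
compressed = zlib.compress(data, level=9)
rate = 8 * len(compressed) / n  # achieved bits per source symbol

print(f"Entropy rate H     : {H:.4f} bits/symbol")
print(f"zlib achieved rate : {rate:.4f} bits/symbol")
# No lossless code can achieve an expected rate below H;
# a good compressor lands somewhere above it.
```

Running this shows the compressor's rate sitting above H (about 0.469 bits/symbol for p = 0.9), consistent with Shannon's limit: better modelling of the source can close the gap, but nothing can cross it.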