Entropy as an Error Measure
In Shannon's paper *A Mathematical Theory of Communication*, he represented a communication system using the following schematic:

[Figure: Shannon's schematic of a general communication system — information source → transmitter → channel (with a noise source) → receiver → destination.]

In it, he defined entropy, a quantity that forms the basis of information theory.

Entropy

Information entropy is interpreted in many ways. One way that I like to think about it is in terms of "how much randomness is present in the state-space?" (similar to Boltzmann's entropy in statistical mechanics). It is defined as follows:

$$H(X) = -\sum_{x \in \mathcal{X}} p(x)\,\log_2 p(x)$$

where $p(x)$ is the probability that the source emits symbol $x$.
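To make the definition concrete, here is a minimal Python sketch (my own illustration, not from Shannon's paper) that computes the entropy of an empirical distribution; the `entropy` helper and the coin-flip samples are assumptions chosen for this example:

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (in bits) of the empirical distribution of `samples`."""
    counts = Counter(samples)
    n = len(samples)
    # H(X) = -sum over symbols of p(x) * log2(p(x))
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A fair coin is maximally random for a binary source: 1 bit per flip.
print(entropy(["H", "T", "H", "T"]))   # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(entropy(["H", "H", "H", "T"]))   # ~0.811
```

The fair coin attains the maximum entropy of a binary source (1 bit), while the biased coin, being more predictable, carries less information per flip.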