Error-free compression techniques usually rely on entropy-based encoding algorithms. The concept of entropy is mathematically described in equation (1):

H(z) = -\sum_{j=1}^{J} P(a_j) \log_2 P(a_j)    (1)

where:
- a_j is a symbol produced by the information source,
- P(a_j) is the probability of that symbol,
- J is the total number of different symbols,
- H(z) is the entropy of the source.
The entropy of a source establishes an upper bound on how much it can be compressed: no lossless code can use fewer than H(z) bits per symbol on average, given the probability distribution of the source. In other words, it is the theoretical limit on the compression achievable with entropy encoding techniques alone.
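As a concrete illustration, the entropy of equation (1) can be estimated from empirical symbol frequencies. The sketch below (the function name `entropy` and the example data are illustrative, not from the source) computes H(z) in bits per symbol:

```python
import math
from collections import Counter

def entropy(symbols):
    """Estimate Shannon entropy H(z) in bits per symbol from
    the empirical frequencies of the symbols a_j."""
    counts = Counter(symbols)          # occurrences of each symbol a_j
    total = len(symbols)
    # H(z) = -sum over j of P(a_j) * log2 P(a_j)
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

# A source emitting 'a' with probability 1/2 and 'b', 'c' with
# probability 1/4 each needs at least 1.5 bits per symbol on average.
data = "aabc" * 100
print(entropy(data))  # 1.5
```

Any lossless code for this source must therefore spend at least 1.5 bits per symbol on average; a Huffman code assigning 'a' one bit and 'b', 'c' two bits each achieves exactly that bound here.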