In information theory, redundancy measures the fractional difference between the entropy H(X) of an ensemble X and its maximum possible value, log(|A_X|), the logarithm of the size of the alphabet A_X. Informally, it is the amount of wasted "space" used to transmit certain data. Data compression is a way to reduce or eliminate unwanted redundancy, while forward error correction is a way of adding desired redundancy for purposes of error detection and correction when communicating over a noisy channel of limited capacity.
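The fractional redundancy described above can be sketched as a short computation: the Shannon entropy of the source distribution is compared against the maximum entropy log(|A_X|) attained by a uniform distribution over the same alphabet. The function names below are illustrative, not from any standard library.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) in bits for a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def redundancy(probs):
    """Fractional redundancy: 1 - H(X) / log2(|A_X|),
    where |A_X| is the alphabet size (number of outcomes)."""
    h_max = math.log2(len(probs))  # maximum entropy: uniform distribution
    return 1 - entropy(probs) / h_max

# A uniform source wastes no space; a heavily biased one is mostly redundant.
print(redundancy([0.5, 0.5]))  # 0.0 — uniform, no redundancy
print(redundancy([0.9, 0.1]))  # ~0.531 — over half the bits are "wasted"
```

A redundancy near 1 signals that a lossless compressor could shrink the data substantially, while redundancy 0 means the source already uses its alphabet at full capacity.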