In information theory, the conditional entropy quantifies the amount of information needed to describe the outcome of a random variable $Y$ given that the value of another random variable $X$ is known. Here, information is measured in shannons, nats, or hartleys. The entropy of $Y$ conditioned on $X$ is written as $\mathrm{H}(Y\mid X)$.
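For concreteness, the standard expression of this quantity for discrete random variables is given below; the alphabets $\mathcal{X}$, $\mathcal{Y}$ and the probability mass functions $p(x,y)$ and $p(y\mid x)$ are notation introduced here for illustration:

$$\mathrm{H}(Y\mid X) \;=\; -\sum_{x\in\mathcal{X}}\sum_{y\in\mathcal{Y}} p(x,y)\,\log p(y\mid x),$$

with the convention that terms with $p(x,y)=0$ contribute zero. Taking the logarithm base 2 gives the result in shannons, base $e$ in nats, and base 10 in hartleys.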