In information theory, joint entropy is a measure of the uncertainty associated with a set of random variables.
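As a minimal sketch (the function name and example distribution are illustrative, not from the source), the joint entropy of a discrete joint distribution, H(X, Y) = −Σ p(x, y) log₂ p(x, y), can be computed as:

```python
import math

def joint_entropy(joint_probs):
    """Joint entropy (in bits) of a discrete joint distribution.

    joint_probs: iterable of the probabilities p(x, y), summing to 1.
    Zero-probability outcomes contribute nothing, by the convention
    0 * log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in joint_probs if p > 0)

# Example: two independent fair coin flips give four equally likely
# outcomes, so the joint entropy is 2 bits.
print(joint_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
```

Because the variables in this example are independent, the joint entropy equals the sum of the individual entropies (1 bit each); dependence between the variables can only lower the joint entropy below that sum.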