Quantum mutual information
Measure in quantum information theory

In quantum information theory, quantum mutual information, or von Neumann mutual information, named after John von Neumann, is a measure of correlation between the subsystems of a quantum state. It is the quantum mechanical analog of Shannon mutual information.


Motivation

For simplicity, it will be assumed that all objects in the article are finite-dimensional.

The definition of quantum mutual information is motivated by the classical case. For a joint probability distribution p(x, y) of two variables, the two marginal distributions are

p(x) = \sum_{y} p(x,y), \qquad p(y) = \sum_{x} p(x,y).

The classical mutual information I(X:Y) is defined by

I(X:Y) = S(p(x)) + S(p(y)) - S(p(x,y)),

where S(q) denotes the Shannon entropy of the probability distribution q.

One can calculate directly

\begin{aligned}
S(p(x)) + S(p(y)) &= -\left(\sum_{x} p(x)\log p(x) + \sum_{y} p(y)\log p(y)\right)\\
&= -\left(\sum_{x}\left(\sum_{y'} p(x,y')\right)\log\sum_{y'} p(x,y') + \sum_{y}\left(\sum_{x'} p(x',y)\right)\log\sum_{x'} p(x',y)\right)\\
&= -\sum_{x,y} p(x,y)\left(\log\sum_{y'} p(x,y') + \log\sum_{x'} p(x',y)\right)\\
&= -\sum_{x,y} p(x,y)\log p(x)p(y)
\end{aligned}

So the mutual information is

I(X:Y) = \sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)p(y)},

where the logarithm is taken in base 2 to obtain the mutual information in bits. This is precisely the relative entropy between p(x, y) and p(x)p(y). In other words, if we assume the two variables x and y to be independent, the mutual information is the discrepancy in uncertainty resulting from this (possibly erroneous) assumption.

It follows from the non-negativity of relative entropy that I(X:Y) ≥ 0, with equality if and only if p(x, y) = p(x)p(y).
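
As a concrete illustration of the classical formula (not part of the original article), the following minimal sketch computes I(X:Y) from a joint probability table, assuming NumPy and base-2 logarithms; the function and variable names are illustrative.

import numpy as np

def shannon_entropy(p):
    # Shannon entropy in bits; zero-probability entries contribute nothing
    p = np.asarray(p, dtype=float).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def classical_mutual_information(joint):
    # I(X:Y) = S(p(x)) + S(p(y)) - S(p(x,y)) for a joint table joint[x, y]
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1)  # marginal over y
    py = joint.sum(axis=0)  # marginal over x
    return shannon_entropy(px) + shannon_entropy(py) + (-shannon_entropy(joint))

# Perfectly correlated bits, p(0,0) = p(1,1) = 1/2, give I(X:Y) = 1 bit.
joint = np.array([[0.5, 0.0],
                  [0.0, 0.5]])
print(classical_mutual_information(joint))  # -> 1.0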

Definition

The quantum mechanical counterparts of classical probability distributions are density matrices.

Consider a quantum system that can be divided into two parts, A and B, such that independent measurements can be made on either part. The state space of the entire quantum system is then the tensor product of the spaces for the two parts.

H_{AB} := H_{A} \otimes H_{B}.

Let ρAB be a density matrix acting on HAB. The von Neumann entropy S(ρ) of a density matrix ρ is the quantum mechanical analog of the Shannon entropy:

S(\rho) = -\operatorname{Tr}\,\rho\log\rho.
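
As a rough illustration (not part of the original article), S(ρ) can be computed numerically from the eigenvalues of the density matrix; the sketch below assumes NumPy and uses base-2 logarithms, so the entropy is reported in bits.

import numpy as np

def von_neumann_entropy(rho):
    # S(rho) = -sum_i lambda_i * log2(lambda_i) over the nonzero eigenvalues of rho
    eigenvalues = np.linalg.eigvalsh(rho)          # rho is Hermitian
    eigenvalues = eigenvalues[eigenvalues > 1e-12]
    return -np.sum(eigenvalues * np.log2(eigenvalues))

# A pure state has zero entropy; the maximally mixed qubit state has entropy 1 bit.
pure = np.array([[1.0, 0.0], [0.0, 0.0]])
mixed = np.eye(2) / 2
print(von_neumann_entropy(pure), von_neumann_entropy(mixed))  # approximately 0.0 and 1.0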

For a probability distribution p(x, y), the marginal distributions are obtained by summing (or integrating) out the variable y or x. The corresponding operation for density matrices is the partial trace. So one can assign to ρAB a state on the subsystem A by

\rho^{A} = \operatorname{Tr}_{B}\,\rho^{AB},

where TrB is the partial trace with respect to system B. This is the reduced state of ρAB on system A. The reduced von Neumann entropy of ρAB with respect to system A is

S(\rho^{A}).

S(ρB) is defined in the same way.
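
For a finite-dimensional bipartite state stored as a matrix, the partial trace amounts to a simple index contraction. The following is an illustrative sketch (not part of the original article), assuming NumPy; the helper name partial_trace is chosen here for clarity.

import numpy as np

def partial_trace(rho_ab, dim_a, dim_b, keep="A"):
    # Reduced state of rho_ab on subsystem A (keep="A") or B (keep="B").
    # rho_ab acts on a space of dimension dim_a * dim_b.
    rho = rho_ab.reshape(dim_a, dim_b, dim_a, dim_b)
    if keep == "A":
        return np.einsum("ikjk->ij", rho)  # sum over the B indices
    return np.einsum("kikj->ij", rho)      # sum over the A indices

# Example: for |00><00| the reduced state on A is |0><0|.
ket00 = np.zeros(4)
ket00[0] = 1.0
rho_ab = np.outer(ket00, ket00)
print(partial_trace(rho_ab, 2, 2, keep="A"))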

It can now be seen that the definition of quantum mutual information, corresponding to the classical definition, should be as follows.

I(A:B) := S(\rho^{A}) + S(\rho^{B}) - S(\rho^{AB}).

Quantum mutual information can be interpreted the same way as in the classical case: it can be shown that

I(A:B) = S(\rho^{AB}\,\|\,\rho^{A}\otimes\rho^{B}),

where S(\cdot\|\cdot) denotes the quantum relative entropy. Note that there is an alternative generalization of mutual information to the quantum case. The difference between the two for a given state is called quantum discord, a measure of the quantum correlations of the state in question.
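
Putting the pieces together, the following self-contained sketch (not part of the original article, assuming NumPy and base-2 logarithms) computes I(A:B) for a bipartite density matrix directly from the definition above; the function names are illustrative.

import numpy as np

def S(rho):
    # von Neumann entropy in bits, from the eigenvalues of rho
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return -np.sum(lam * np.log2(lam))

def reduced(rho_ab, dim_a, dim_b, keep):
    # partial trace over B (keep="A") or over A (keep="B")
    rho = rho_ab.reshape(dim_a, dim_b, dim_a, dim_b)
    return np.einsum("ikjk->ij", rho) if keep == "A" else np.einsum("kikj->ij", rho)

def mutual_information(rho_ab, dim_a, dim_b):
    # I(A:B) = S(rho_A) + S(rho_B) - S(rho_AB)
    return (S(reduced(rho_ab, dim_a, dim_b, "A"))
            + S(reduced(rho_ab, dim_a, dim_b, "B"))
            - S(rho_ab))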

Properties

When the state ρAB is pure (and thus S(ρAB) = 0), the mutual information is twice the entanglement entropy of the state:

I(A:B) = S(\rho^{A}) + S(\rho^{B}) - S(\rho^{AB}) = S(\rho^{A}) + S(\rho^{B}) = 2S(\rho^{A}),

since the two reduced states of a pure bipartite state have equal entropy, S(\rho^{A}) = S(\rho^{B}).
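
For instance, reusing the illustrative mutual_information sketch above (not from the article), the Bell state (|00⟩ + |11⟩)/√2 is pure, its reduced states are maximally mixed, and its mutual information comes out as 2 bits, twice the entanglement entropy.

import numpy as np

# Bell state (|00> + |11>)/sqrt(2); each reduced state is maximally mixed.
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)
rho_bell = np.outer(bell, bell)
print(mutual_information(rho_bell, 2, 2))  # -> 2.0 bits, up to floating-point error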

A positive quantum mutual information is not necessarily indicative of entanglement, however. A classical mixture of separable states always has zero entanglement, but it can have nonzero quantum mutual information, for example

\rho^{AB} = \frac{1}{2}\left(|00\rangle\langle 00| + |11\rangle\langle 11|\right)

\begin{aligned}
I(A:B) &= S(\rho^{A}) + S(\rho^{B}) - S(\rho^{AB})\\
&= S\left(\tfrac{1}{2}(|0\rangle\langle 0| + |1\rangle\langle 1|)\right) + S\left(\tfrac{1}{2}(|0\rangle\langle 0| + |1\rangle\langle 1|)\right) - S\left(\tfrac{1}{2}(|00\rangle\langle 00| + |11\rangle\langle 11|)\right)\\
&= \log 2 + \log 2 - \log 2 = \log 2
\end{aligned}

In this case, the state is merely a classically correlated state.
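
Continuing with the same illustrative mutual_information helper sketched earlier, this classically correlated state can be checked numerically; with base-2 logarithms, log 2 corresponds to 1 bit.

import numpy as np

# The classically correlated state (|00><00| + |11><11|)/2 in the computational basis.
rho_cc = np.zeros((4, 4))
rho_cc[0, 0] = rho_cc[3, 3] = 0.5
print(mutual_information(rho_cc, 2, 2))  # -> 1.0 bit (= log 2), despite zero entanglement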