The paper by A.S. Holevo addresses the capacity of a quantum channel with arbitrary (possibly mixed) signal states. It demonstrates that the capacity of such a channel equals the maximum of the entropy bound over all a priori distributions. This result completes a recent finding by Hausladen, Jozsa, Schumacher, Westmoreland, and Wootters, who proved the equality for pure-state channels. The main theorem states that the capacity \( C \) is given by:
\[ C = \max_{\pi} [H(\sum_{i \in A} \pi_i S_i) - \sum_{i \in A} \pi_i H(S_i)], \]
where \( H \) denotes the von Neumann entropy, \( \pi = \{\pi_i\} \) ranges over a priori probability distributions on the signal alphabet \( A \), and \( S_i \) are the signal density operators. The proof involves projecting onto the typical subspace, suitably modified for mixed states, and estimating the error probability, which is more involved than in the pure-state case; the estimate uses a Gram matrix and trace properties. The random coding argument then shows that the error probability decreases exponentially with the block length, yielding the desired inequality.
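The capacity expression above is the Holevo quantity \( \chi \) of the ensemble \( \{\pi_i, S_i\} \), maximized over priors. As a minimal numerical sketch (not from the paper; the example ensemble and function names are hypothetical), the quantity inside the maximum can be evaluated for a given prior and set of density matrices:

```python
import numpy as np

def von_neumann_entropy(rho):
    """H(rho) = -Tr(rho log2 rho), computed from the eigenvalues, in bits."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]  # drop zero eigenvalues (0 log 0 = 0)
    return float(-np.sum(eigvals * np.log2(eigvals)))

def holevo_quantity(priors, states):
    """chi = H(sum_i pi_i S_i) - sum_i pi_i H(S_i) for a fixed prior pi."""
    avg = sum(p * s for p, s in zip(priors, states))
    return von_neumann_entropy(avg) - sum(
        p * von_neumann_entropy(s) for p, s in zip(priors, states)
    )

# Two mixed qubit signal states (an illustrative ensemble, not from the paper)
s0 = np.array([[0.9, 0.0], [0.0, 0.1]])  # mostly |0><0|
s1 = np.array([[0.1, 0.0], [0.0, 0.9]])  # mostly |1><1|
chi = holevo_quantity([0.5, 0.5], [s0, s1])
print(round(chi, 4))  # H(avg) = 1 bit minus the average state entropy
```

For this symmetric ensemble the average state is maximally mixed, so \( \chi = 1 - H(S_0) \approx 0.531 \) bits; because the signal states are mixed, \( \chi \) is strictly below the 1 bit a pure-state pair on the same basis would give. The capacity in the theorem is the maximum of this quantity over all priors.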