'Bounds on information combining' are entropic inequalities that determine how the information (entropy) of a set of random variables can change when these are combined in certain prescribed ways. Such bounds play an important role in classical information theory, particularly in coding and Shannon theory; entropy power inequalities are special instances of them. Arguably the most elementary kind of information combining is the addition of two binary random variables (a CNOT gate), and the resulting quantities play an important role in belief propagation and polar coding. We investigate this problem in the setting where quantum side information is available, which has been recognized as a hard setting for entropy power inequalities. Our main technical result is a non-trivial, and close to optimal, lower bound on the combined entropy, which can be seen as an almost optimal 'quantum Mrs. Gerber's Lemma'. Our proof uses three main ingredients: 1) a new bound on the concavity of von Neumann entropy, which is tight in the regime of low pairwise state fidelities; 2) the quantitative improvement of strong subadditivity due to Fawzi-Renner, in which we manage to handle the minimization over recovery maps; and 3) recent duality results on classical-quantum channels due to Renes et al. We furthermore present conjectures on the optimal lower and upper bounds under quantum side information, supported by interesting analytical observations and strong numerical evidence. We finally apply our bounds to polar coding for binary-input classical-quantum channels, and show the following three results: 1) even non-stationary channels polarize under the polar transform; 2) the blocklength required to approach the symmetric capacity scales at most sub-exponentially in the gap to capacity; and 3) under the aforementioned lower bound conjecture, a blocklength polynomial in the gap suffices.
|  |  |
| --- | --- |
| Journal | IEEE Transactions on Information Theory |
| Publication status | Published - 1 Jul 2018 |
- Information theory
- binary codes
- channel capacity
- channel coding
- information entropy
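The classical Mrs. Gerber's Lemma referenced above lower-bounds the conditional entropy of the XOR of two binary random variables: H(X1 ⊕ X2 | Y1, Y2) ≥ h(h⁻¹(H(X1|Y1)) ⋆ h⁻¹(H(X2|Y2))), where h is the binary entropy function and a ⋆ b = a(1−b) + b(1−a) is binary convolution. As a minimal numerical sketch (our own helper names `h`, `h_inv`, `star`; not code from the paper), the bound can be checked in the binary-symmetric case, where it holds with equality:

```python
import math

def h(p):
    """Binary entropy in bits."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def h_inv(y, tol=1e-12):
    """Inverse of binary entropy on [0, 1/2], by bisection (h is increasing there)."""
    lo, hi = 0.0, 0.5
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if h(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def star(a, b):
    """Binary convolution a * b = a(1-b) + b(1-a)."""
    return a * (1 - b) + b * (1 - a)

# Two binary-symmetric sources with crossover probabilities p and q
# (illustrative values):
p, q = 0.11, 0.3
H1, H2 = h(p), h(q)

# Exact combined conditional entropy in this symmetric case:
H_combined = h(star(p, q))

# Mrs. Gerber's Lemma lower bound, computed from H1 and H2 alone:
mgl_bound = h(star(h_inv(H1), h_inv(H2)))

# For binary symmetric channels the classical bound is tight:
assert abs(H_combined - mgl_bound) < 1e-6
```

The paper's contribution is the analogue of this bound when the side information Y1, Y2 is quantum, where the classical equality case no longer determines the optimal bound.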