Abstract
Having a distance measure between quantum states satisfying the right properties is of fundamental importance in all areas of quantum information. In this work, we present a systematic study of the geometric Rényi divergence (GRD), also known as the maximal Rényi divergence, from the point of view of quantum information theory. We show that this divergence, together with its extension to channels, has many appealing structural properties which are not satisfied by other quantum Rényi divergences. For example, we prove a chain rule inequality that immediately implies the "amortization collapse" for the geometric Rényi divergence, addressing an open question by Berta et al. [Letters in Mathematical Physics 110:2277–2336, 2020, Equation (55)] in the area of quantum channel discrimination. As applications, we explore various channel capacity problems and construct new channel information measures based on the geometric Rényi divergence, sharpening the previously best-known bounds based on the max-relative entropy while still keeping the new bounds single-letter and efficiently computable. A plethora of examples is investigated, and the improvements are evident in almost all cases.
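For concreteness, the geometric Rényi divergence of two states admits the closed form D̂_α(ρ‖σ) = (1/(α−1)) log Tr[σ (σ^{−1/2} ρ σ^{−1/2})^α] for α ∈ (1, 2]. The following is a minimal NumPy sketch of this formula, assuming σ is full rank; the function names are ours, not from the paper:

```python
import numpy as np

def _herm_pow(A, t):
    """Fractional power of a Hermitian positive-definite matrix via eigendecomposition."""
    w, v = np.linalg.eigh(A)
    return (v * w**t) @ v.conj().T

def geometric_renyi_divergence(rho, sigma, alpha):
    """Geometric (maximal) Renyi divergence of states, in bits,
    for density matrices rho, sigma with sigma full rank and alpha in (1, 2]."""
    s = _herm_pow(sigma, -0.5)
    middle = s @ rho @ s  # sigma^{-1/2} rho sigma^{-1/2}
    val = np.trace(sigma @ _herm_pow(middle, alpha)).real
    return np.log2(val) / (alpha - 1)
```

For commuting (diagonal) states this reduces to the classical Rényi divergence, which gives an easy sanity check, e.g. D̂_α(ρ‖ρ) = 0.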
Highlights
In information theory, an imperfect communication link between a sender and a receiver is modeled as a noisy channel.
The first two inequalities follow since the Rains information R(N) has been shown to be a strong converse bound on the unassisted quantum capacity [11].
The private capacity of a quantum channel is defined as the maximum rate at which classical information can be transmitted privately from the sender (Alice) to the receiver (Bob).
Summary
An imperfect communication link between a sender and a receiver is modeled as a noisy channel. The geometric Rényi divergence satisfies the chain rule for any quantum states ρ_{RA}, σ_{RA} and quantum channels N and M:

D_α(N_{A→B}(ρ_{RA}) ‖ M_{A→B}(σ_{RA})) ≤ D_α(ρ_{RA} ‖ σ_{RA}) + D_α(N ‖ M).

These properties set the GRD clearly apart from other Rényi divergences. We construct new channel information measures based on the geometric Rényi divergence, sharpening the previous bounds based on the max-relative entropy in general while still keeping the new bounds single-letter and efficiently computable. On the technical side, we showcase that the geometric Rényi divergence, which had not previously been exploited in the quantum information literature, is quite useful for channel capacity problems. Our new capacity bounds meet all the aforementioned desirable criteria and improve on the previously best-known results in general, making them suitable as new benchmarks for computing the capacities of quantum channels.
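The amortization collapse mentioned in the abstract follows from the chain rule essentially in one line. The sketch below is in our own notation (D̂_α for the geometric Rényi divergence, D̂_α(N‖M) for its channel extension, D̂_α^amo for the amortized channel divergence), assuming the standard definition of amortization:

```latex
% Chain rule for the geometric Renyi divergence:
\widehat{D}_\alpha\big(\mathcal{N}_{A\to B}(\rho_{RA})\,\big\|\,
  \mathcal{M}_{A\to B}(\sigma_{RA})\big)
  \le \widehat{D}_\alpha(\rho_{RA}\|\sigma_{RA})
      + \widehat{D}_\alpha(\mathcal{N}\|\mathcal{M}).
% Taking the supremum over input pairs (rho_{RA}, sigma_{RA}) shows the
% amortized channel divergence cannot exceed the one-shot channel
% divergence; the reverse inequality is immediate (take rho = sigma):
\widehat{D}_\alpha^{\mathrm{amo}}(\mathcal{N}\|\mathcal{M})
  := \sup_{\rho,\sigma}\Big[
       \widehat{D}_\alpha(\mathcal{N}(\rho)\|\mathcal{M}(\sigma))
       - \widehat{D}_\alpha(\rho\|\sigma)\Big]
   = \widehat{D}_\alpha(\mathcal{N}\|\mathcal{M}).
```

This collapse is what makes the GRD useful for adaptive channel discrimination: adaptive strategies cannot amortize an advantage across rounds.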