Abstract
In this paper, we generalize the vector entropy power inequality (EPI) recently put forth in [1], which was valid only for diagonal matrices, to the full matrix case. Next, we study the problem of computing the linear precoder that maximizes the mutual information in linear vector Gaussian channels with arbitrary inputs. In particular, we transform the precoder optimization problem into a new form and, capitalizing on the newly derived matrix EPI, we show that some particular instances of the optimization problem can be cast in convex form, i.e., with an optimality certificate, which, to the best of our knowledge, had not been obtained previously.
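For background only (this is not part of the abstract and the notation is assumed, not the paper's own), the classical EPI that the matrix result generalizes, and a standard formulation of the linear precoding problem in vector Gaussian channels, can be sketched as:

```latex
% (i) Classical EPI for independent random vectors X, Y in R^n, with N(.)
%     the entropy power and h(.) the differential entropy.
% (ii) A commonly used precoding model and objective; H, P, x, n, and the
%      power budget P_0 are illustrative symbols, not the paper's notation.
\begin{align*}
  &\text{(i)}\quad N(X+Y) \;\ge\; N(X) + N(Y),
    \qquad N(X) \triangleq \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)}, \\[4pt]
  &\text{(ii)}\quad \mathbf{y} = \mathbf{H}\mathbf{P}\mathbf{x} + \mathbf{n},
    \qquad
    \max_{\mathbf{P}} \; I(\mathbf{x};\mathbf{y})
    \quad \text{s.t.} \quad \operatorname{tr}\!\left(\mathbf{P}\mathbf{P}^{H}\right) \le P_0 .
\end{align*}
```

The paper's contribution, as stated in the abstract, is a matrix-valued counterpart of (i) and its use to certify convexity (and hence global optimality) for particular instances of (ii).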