Abstract

The concept of attention has proven to be highly relevant in artificial intelligence. Relative entropy (RE, also known as Kullback–Leibler divergence) plays a central role in communication theory. Here, these two concepts, attention and RE, are combined. RE guides optimal encoding of messages in bandwidth-limited communication as well as optimal message decoding via the maximum entropy principle. In the coding scenario, RE can be derived from four requirements, namely that it be analytical, local, proper, and calibrated. Weighted RE, used for attention steering in communications, turns out to be improper. To see how proper attention communication can emerge, a scenario is analyzed in which a message sender wants to ensure that the receiver of the message can perform well-informed actions. In the case where only the curvatures of the utility function's maxima are known, it becomes desirable to accurately communicate an attention function, here a probability function weighted by this curvature and re-normalized. Entropic attention communication is proposed as the desired generalization of entropic communication that permits weighting while remaining proper, thereby aiding the design of optimal communication protocols in technical applications and helping to understand human communication. It also quantifies the level of cooperation expected under misaligned interests of otherwise honest communication partners.
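For orientation, a minimal sketch of the quantities named above, in notation assumed here rather than taken from the paper (p is the sender's probability, q the receiver's, w a weight, and c the curvature at the utility maxima; the paper's exact weighting may differ):

% Relative entropy (Kullback–Leibler divergence) between p and q
\mathrm{RE}(p\,\|\,q) = \int \mathrm{d}x\; p(x)\,\ln\frac{p(x)}{q(x)}

% Weighted RE, as used for attention steering (shown in the paper to be improper)
\mathrm{RE}_w(p\,\|\,q) = \int \mathrm{d}x\; w(x)\,p(x)\,\ln\frac{p(x)}{q(x)}

% A curvature-weighted, re-normalized attention function of the kind described
a(x) = \frac{c(x)\,p(x)}{\int \mathrm{d}x'\; c(x')\,p(x')}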

