Abstract

Associative memories are devices that store information in such a way that it can be fully retrieved from a partial disclosure of it. We examine a toy model of associative memory, and the ultimate limitations it is subject to, within the framework of general probabilistic theories (GPTs), which represent the most general class of physical theories satisfying some basic operational axioms. We ask how large the dimension of a GPT should be so that it can accommodate $2^m$ states with the property that any $N$ of them are perfectly distinguishable. Call $d(N,m)$ the minimal such dimension. Invoking an old result by Danzer and Grünbaum, we prove that $d(2,m) = m+1$, to be compared with a dimension that grows exponentially in $m$ when the GPT is required to be either classical or quantum. This yields an example of a task where GPTs outperform both classical and quantum theory exponentially. More generally, we resolve the case of fixed $N$ and asymptotically large $m$, proving that $d(N,m)$ grows only polynomially in $m$ (as $m \to \infty$) for every fixed $N$, which again yields an exponential improvement over classical and quantum theories. Finally, we develop a numerical approach to the general problem of finding the largest $N$-wise mutually distinguishable set of states for a given GPT, which can be seen as an instance of the maximum clique problem on $N$-regular hypergraphs.
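
The numerical approach mentioned in the last sentence is only described here at the level of the abstract; the sketch below is one plausible way to set up such a search, not the paper's implementation. It assumes a GPT whose state space is a polytope given by its extreme points, tests joint distinguishability of an $N$-tuple of states as a linear feasibility problem over the effects (via scipy.optimize.linprog), and then looks for the largest set of candidate states all of whose $N$-subsets pass the test, i.e. a maximum clique in the corresponding $N$-uniform hypergraph. The function names and the backtracking search are illustrative choices.

```python
# Illustrative sketch (not the paper's code): largest N-wise mutually
# distinguishable set of states in a polytopic GPT, phrased as a maximum-clique
# search on the N-uniform hypergraph of jointly distinguishable N-tuples.
from itertools import combinations

import numpy as np
from scipy.optimize import linprog


def jointly_distinguishable(states, extreme_points):
    """Test whether the given states of a polytopic GPT are perfectly
    distinguishable, i.e. whether there exist affine effects e_1, ..., e_N,
    non-negative on the state space and summing to the unit effect, with
    e_i(omega_j) = delta_ij.  This is a linear feasibility problem."""
    states = np.asarray(states, dtype=float)
    ext = np.asarray(extreme_points, dtype=float)
    N, d = states.shape
    nvar = N * (d + 1)  # variables (a_i, b_i), with e_i(omega) = a_i . omega + b_i

    def row(i, point, const):
        r = np.zeros(nvar)
        r[i * (d + 1): i * (d + 1) + d] = point
        r[i * (d + 1) + d] = const
        return r

    A_eq, b_eq = [], []
    for i in range(N):                      # e_i(omega_j) = delta_ij
        for j in range(N):
            A_eq.append(row(i, states[j], 1.0))
            b_eq.append(1.0 if i == j else 0.0)
    for k in range(d):                      # sum_i a_i = 0
        r = np.zeros(nvar)
        for i in range(N):
            r[i * (d + 1) + k] = 1.0
        A_eq.append(r)
        b_eq.append(0.0)
    r = np.zeros(nvar)                      # sum_i b_i = 1
    for i in range(N):
        r[i * (d + 1) + d] = 1.0
    A_eq.append(r)
    b_eq.append(1.0)

    A_ub, b_ub = [], []
    for i in range(N):                      # e_i >= 0 on every extreme point
        for v in ext:
            A_ub.append(-row(i, v, 1.0))
            b_ub.append(0.0)

    res = linprog(np.zeros(nvar), A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=[(None, None)] * nvar, method="highs")
    return res.status == 0


def largest_distinguishable_set(candidates, extreme_points, N):
    """Backtracking maximum-clique search on the N-uniform hypergraph whose
    vertices are the candidate states and whose hyperedges are the jointly
    distinguishable N-tuples.  Exponential time; meant for small instances."""
    candidates = [np.asarray(c, dtype=float) for c in candidates]
    best = []

    def extend(clique, rest):
        nonlocal best
        if len(clique) > len(best):
            best = list(clique)
        for i, v in enumerate(rest):
            # Only the N-subsets of clique + [v] that contain v still need
            # checking; the others were verified when the clique was built.
            if all(jointly_distinguishable(list(sub) + [v], extreme_points)
                   for sub in combinations(clique, N - 1)):
                extend(clique + [v], rest[i + 1:])

    extend([], candidates)
    return best


if __name__ == "__main__":
    # Toy example: the square ("gbit") state space [0, 1]^2, a GPT of
    # dimension 3.  All four vertices are pairwise (N = 2) distinguishable,
    # in line with the hypercube construction behind the m + 1 bound (m = 2).
    square = [(0, 0), (0, 1), (1, 0), (1, 1)]
    print(len(largest_distinguishable_set(square, square, N=2)))  # -> 4
```

The clique search above is exact but exponential in the number of candidate states; for larger instances one would swap in a dedicated (hyper)graph clique solver or a heuristic while reusing the same linear-programming distinguishability test.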
