The no-free-lunch (NFL) theorem is a celebrated result in learning theory that limits one's ability to learn a function with a training dataset. With the recent rise of quantum machine learning, it is natural to ask whether there is a quantum analog of the NFL theorem, which would restrict a quantum computer's ability to learn a unitary process with quantum training data. However, in the quantum setting, the training data can possess entanglement, a strong correlation with no classical analog. In this Letter, we show that entangled datasets lead to an apparent violation of the (classical) NFL theorem. This motivates a reformulation that accounts for the degree of entanglement in the training set. As our main result, we prove a quantum NFL theorem whereby the fundamental limit on the learnability of a unitary is reduced by entanglement. We employ Rigetti's quantum computer to test both the classical and quantum NFL theorems. Our Letter establishes that entanglement is a commodity in quantum machine learning.