Abstract

Over the last decade, researchers have studied the interplay between quantum computing and classical machine learning algorithms. However, measurements typically disturb or destroy quantum states, so data processing must be repeated many times to estimate observable values. In particular, this prevents online (real-time, single-shot) processing of temporal data, as measurements are commonly performed at intermediate stages. Recently, it was proposed to sidestep this issue by focusing on tasks with quantum output, eliminating the need for detectors. Inspired by reservoir computing, a model was proposed in which only a subset of the internal parameters are trained while the others are fixed at random values. Here, we also process quantum time series, but we do so using a Recurrent Gaussian Quantum Network (RGQN) whose internal interactions can all be trained. As expected, this increased flexibility yields higher performance on benchmark tasks. Building on this, we show that the RGQN can tackle two quantum communication tasks while removing some hardware restrictions of currently available methods. First, it enhances the transmission rate of quantum channels that experience certain memory effects in a more resource-efficient way. Second, it can counteract similar memory effects when they are unwanted, a task that could previously be solved only when redundantly encoded input signals were available. Finally, we run a small-scale version of this last task on Xanadu’s photonic processor Borealis.
