Summary
Wireless multimodal interactions over long distance (WMILD) would give rise to numerous thrilling applications, such as remote touching and immersive teleoperation. However, long distances induce large propagation delays, making it difficult to meet the ultra-low latency requirements of haptic-visual interactions. Since existing works mainly focus on the wireless access part, this paper designs an end-to-end framework for general WMILD applications based on digital twin (DT) technology and proposes an intelligent resource allocation and parameter compression scheme to guarantee WMILD performance under constrained network resources. In the framework, a user device acquires real-time remote interactions by interacting locally with a nearby base station (BS), where a DT of the remote side is deployed to predict the remote haptic-visual feedback. A reliable DT updating process is carefully designed to guarantee that the DT accurately models its dynamic physical counterpart. To optimize the updating reliability, we formulate resource allocation and parameter compression as a constrained Markov decision process, subject to constraints on energy consumption, multimodal interaction latency, and updating latency. A safe deep reinforcement learning algorithm is then proposed to adapt resources and compression to the dynamic DT updating workload, multimodal data streams, and remote transmission capacities. Simulations show that the framework achieves higher updating reliability than baseline schemes.
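The constrained-MDP idea behind the scheme can be illustrated with a minimal primal-dual ("Lagrangian") sketch: an agent picks a parameter-compression level to maximize DT-updating reliability while a dual multiplier enforces a latency budget. All action sets, reward/latency numbers, and hyperparameters below are illustrative assumptions, not the paper's model.

```python
import random

# Toy sketch of the constrained-MDP formulation: choose a compression level
# each step to maximize DT-updating reliability (reward) while keeping
# latency (cost) under a budget, via a primal-dual safe-RL-style update.
# All names and values here are assumed for illustration.

random.seed(0)

ACTIONS = [0, 1, 2]              # assumed levels: none / moderate / heavy
RELIABILITY = [1.0, 0.8, 0.3]    # heavier compression -> less accurate update
LATENCY = [0.9, 0.45, 0.2]       # heavier compression -> lower latency cost
LATENCY_BUDGET = 0.5             # assumed per-step latency constraint

def step(action):
    """Assumed stateless environment with small observation noise."""
    r = RELIABILITY[action] + random.uniform(-0.05, 0.05)
    c = LATENCY[action] + random.uniform(-0.05, 0.05)
    return r, c

q = {a: 0.0 for a in ACTIONS}    # action values of the Lagrangian reward
lam, alpha, beta, eps = 0.0, 0.1, 0.01, 0.2
latencies = []

for _ in range(5000):
    a = random.choice(ACTIONS) if random.random() < eps else max(q, key=q.get)
    r, c = step(a)
    # Primal step: learn the Lagrangian reward  r - lam * c.
    q[a] += alpha * ((r - lam * c) - q[a])
    # Dual step: raise the multiplier while the latency budget is violated.
    lam = max(0.0, lam + beta * (c - LATENCY_BUDGET))
    latencies.append(c)

best = max(q, key=q.get)
recent_avg = sum(latencies[-2000:]) / 2000
print(f"preferred level: {best}, multiplier: {lam:.2f}, "
      f"recent avg latency: {recent_avg:.2f}")
```

At the dual equilibrium the multiplier settles where the long-run latency meets the budget, so the learned policy mixes compression levels rather than always picking the most reliable one; the paper's safe DRL algorithm plays the same role in the full problem with states, energy constraints, and deep function approximation.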