Conventional electromyography (EMG) measures continuous neural activity during muscle contraction but does not explicitly quantify the contraction itself. Mechanomyography (MMG) and accelerometers measure only body-surface motion, while ultrasound, CT, and MRI are restricted to in-clinic snapshots. Here we propose radiomyography (RMG), a novel method for continuous muscle actuation sensing that can be worn or operated touchlessly and captures both superficial and deep muscle groups. We verified RMG experimentally with a wearable forearm sensor for hand gesture recognition (HGR). We first converted the sensor outputs to time-frequency spectrograms and then employed a vision transformer (ViT) deep learning network as the classification model, which recognized 23 gestures with an average accuracy of up to 99% across 8 subjects. Through transfer learning, the model achieved high adaptivity to user differences and sensor variations, with an average accuracy of up to 97%. We further extended RMG to eye and leg muscles and achieved high accuracy in eye movement and body posture tracking. RMG can be used together with synchronous EMG to derive stimulation-actuation waveforms, with many potential applications in kinesiology, physiotherapy, rehabilitation, and human-machine interfaces.
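
To make the spectrogram-plus-ViT pipeline concrete, the sketch below shows one plausible way to convert a raw RMG channel into a log-spectrogram "image" and classify it; the sampling rate, STFT parameters, and the choice of torchvision's `vit_b_16` are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of the RMG classification pipeline (assumed parameters).
import numpy as np
import torch
import torch.nn.functional as F
from scipy.signal import spectrogram
from torchvision.models import vit_b_16

FS = 2000          # assumed sensor sampling rate in Hz (not from the paper)
N_GESTURES = 23    # number of hand gestures recognized in the study

def rmg_to_image(signal: np.ndarray) -> torch.Tensor:
    """Turn a 1-D RMG signal into a 3x224x224 log-spectrogram tensor."""
    _, _, sxx = spectrogram(signal, fs=FS, nperseg=256, noverlap=192)
    sxx = np.log1p(sxx)                              # compress dynamic range
    img = torch.from_numpy(sxx).float()[None, None]  # shape (1, 1, F, T)
    img = F.interpolate(img, size=(224, 224), mode="bilinear",
                        align_corners=False)         # resize to ViT input
    return img.repeat(1, 3, 1, 1)                    # ViT expects 3 channels

model = vit_b_16(weights=None, num_classes=N_GESTURES)
model.eval()

x = rmg_to_image(np.random.randn(FS * 2))  # 2 s of synthetic signal
with torch.no_grad():
    gesture = model(x).argmax(dim=1)
print(gesture.item())

# Hedged sketch of the transfer-learning step: adapt to a new user or
# sensor by freezing the backbone and fine-tuning only the classifier head.
for p in model.parameters():
    p.requires_grad = False
for p in model.heads.parameters():
    p.requires_grad = True
```

In this reading, per-user adaptation retrains only the small classification head on a short calibration session, which is one common way to realize the reported robustness to user and sensor variation.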