Abstract

This paper presents an articulatory biofeedback system and discusses new research methods made possible by this technology. The real-time electromagnetic articulography biofeedback system (RT-EMA) enables speakers to observe a visual representation of the movements of their speech articulators while they are speaking. Investigators can dynamically control the visual display of virtual targets or other objects in vocal tract space, track events involving interactions between virtual objects and articulators, and define custom actions in response to such events. Preliminary findings from experimental studies and games employing biofeedback are reported, with emphasis on the potential applications of articulatory biofeedback for investigating questions of linguistic interest.