Abstract

This paper explores a novel human–machine interaction (HMI) paradigm that exploits the sensing, storage, computation, and communication (SSCC) capabilities of mobile devices to provide intuitive interactions with dynamic systems. The paradigm addresses fundamental HMI challenges by integrating computer vision, 3D virtual graphics, and touchscreen sensing to develop mobile apps that deliver interactive augmented reality (AR) visualizations. Whereas prior approaches relied on laboratory-grade hardware, e.g., a personal computer (PC) and vision system, to stream video to remote users, our approach exploits the inherent mobility of mobile devices to immerse users in mixed-reality (MR) environments in which the laboratory test-bed and augmented visualizations coexist and interact in real time, promoting immersive learning experiences that do not yet exist in engineering laboratories. By pointing a device's rear-facing camera at the system from an arbitrary perspective, computer vision techniques retrieve physical measurements that are used to render interactive AR content or to perform feedback control. Future work is expected to examine the potential of this approach for teaching the fundamentals of dynamic systems, automatic control, and robotics through inquiry-based activities with students.
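The abstract's pipeline of retrieving a physical measurement from a camera frame and feeding it to a controller can be illustrated with a minimal sketch. The paper does not specify its vision algorithm; the bright-blob centroid detector and proportional controller below are hypothetical simplifications, using NumPy only, meant to show the measure-then-actuate loop rather than the authors' actual implementation.

```python
import numpy as np

def detect_marker(frame: np.ndarray, threshold: int = 200) -> tuple[float, float]:
    """Return the (row, col) centroid of bright pixels in a grayscale frame.

    Stand-in for the vision-based measurement step: a real system would
    detect a fiducial or tracked feature in the camera image instead.
    """
    ys, xs = np.nonzero(frame > threshold)
    if xs.size == 0:
        raise ValueError("marker not found in frame")
    return float(ys.mean()), float(xs.mean())

def p_control(measured: float, setpoint: float, kp: float = 0.5) -> float:
    """Proportional feedback: control command proportional to the error."""
    return kp * (setpoint - measured)

# Synthetic "camera frame": a 5x5 bright marker centered at row 40, col 60.
frame = np.zeros((100, 100), dtype=np.uint8)
frame[38:43, 58:63] = 255

row, col = detect_marker(frame)       # centroid of the bright blob
u = p_control(col, setpoint=50.0)     # command driving the marker toward col 50
```

In an actual mobile app, `frame` would come from the rear-facing camera each video frame, and `u` would be transmitted to the test-bed's actuators, closing the feedback loop through the device.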
