This research presents an approach that integrates gesture recognition into a Mixed Reality (MR) interface for human–machine collaboration in the quality control, fabrication, and assembly of the Unlog Tower. MR platforms enable users to interact with three-dimensional holographic instructions during the assembly and fabrication of highly customized, parametric architectural constructions without the need for two-dimensional drawings. Previous MR fabrication projects have relied primarily on digital menus and custom buttons within the interface for user interaction between the virtual and physical environments. Although widely adopted, this approach limits direct human interaction with physical objects as a means of modifying fabrication instructions in the virtual environment. This research instead captures user interactions with physical objects through real-time gesture recognition and uses them as input to modify, update, or generate new digital information. This integration establishes reciprocal stimuli between the physical and virtual environments, wherein digital content is generated from the user's tactile interaction with physical objects, thereby providing the user with direct, seamless feedback during the fabrication process. Through this method, the research develops and presents three distinct Gesture-Based Mixed Reality (GBMR) workflows: object localization, object identification, and object calibration. These workflows use gesture recognition to strengthen the interaction between the virtual and physical environments, enabling precise localization, intuitive identification, and accurate calibration of physical objects. The results of these methods are demonstrated through a comprehensive case study: the construction of the Unlog Tower, a 36'-tall robotically fabricated timber structure.
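To make the interaction model concrete, the sketch below shows one plausible way recognized gestures could be routed to the three GBMR workflows named above. It is a minimal, self-contained Python sketch under assumed conditions: the gesture vocabulary (TAP, PINCH, HOLD), the `GestureEvent` data type, and the handler logic are illustrative placeholders, not the paper's headset implementation.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Callable, Dict, Tuple

# Hypothetical gesture vocabulary; the paper's actual gesture set is not
# specified here, so these names are assumptions for illustration.
class Gesture(Enum):
    TAP = auto()    # e.g., touch a physical member to localize its hologram
    PINCH = auto()  # e.g., select a member to identify it in the model
    HOLD = auto()   # e.g., confirm a reference point for calibration

@dataclass
class GestureEvent:
    gesture: Gesture
    position: Tuple[float, float, float]  # fingertip position in world space

class GBMRDispatcher:
    """Routes recognized gestures to the three GBMR workflows:
    object localization, object identification, and object calibration."""

    def __init__(self) -> None:
        self.handlers: Dict[Gesture, Callable[[GestureEvent], None]] = {
            Gesture.TAP: self.localize_object,
            Gesture.PINCH: self.identify_object,
            Gesture.HOLD: self.calibrate_object,
        }

    def dispatch(self, event: GestureEvent) -> None:
        """Invoke the workflow registered for the recognized gesture."""
        self.handlers[event.gesture](event)

    def localize_object(self, event: GestureEvent) -> None:
        # Anchor a member's holographic instructions at the touched position.
        print(f"localize: anchor hologram at {event.position}")

    def identify_object(self, event: GestureEvent) -> None:
        # Look up which digital member corresponds to the touched object.
        print(f"identify: query member nearest {event.position}")

    def calibrate_object(self, event: GestureEvent) -> None:
        # Refine the digital model's alignment from a touched reference point.
        print(f"calibrate: add reference point {event.position}")

if __name__ == "__main__":
    dispatcher = GBMRDispatcher()
    dispatcher.dispatch(GestureEvent(Gesture.TAP, (0.5, 1.2, 0.3)))
```

The design choice illustrated here is the dispatch table: because each gesture maps to exactly one workflow, the tactile event stream from the physical environment can drive updates to the virtual environment without any menu or button mediation, which is the premise of the GBMR approach.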