Abstract

Computational design is increasingly interested in the active feedback between the user/designer and the digital space. Often, our initial instinct as designers comes from a gesture, a movement of the hands that gets translated into sketches and 3D models via the tools available to us. While the physical realm allows for muscle memory, tactile feedback, and creative output via movement, digital design often negates the body of the designer as it sequesters us into a screen-mouse-hand relationship. Moreover, current CAD software tools often reinforce this standardization, further limiting the potential of physical bodily gestures as a vehicle for architectural form-making. Seeking new opportunities for a gestural interface, this research explores how Machine Learning and parametric design tools can be used to translate active movements and gestural actions into rich and complex digital models without the need for specialized equipment. In this paper, we present an open-source and economically accessible methodology for designers to translate hand movements into the digital world, implementing the MediaPipe Hands tracking library. In developing this workflow, this research explores opportunities to create more direct, vital links between expressive gesture and architectural form, with an emphasis on creating platforms that are accessible not only to design experts, but also to the broader public.
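The abstract does not reproduce the paper's workflow, but a minimal sketch of the kind of landmark stream MediaPipe Hands provides from an ordinary webcam might look like the following. The webcam capture via OpenCV and the choice of printing a single fingertip coordinate are assumptions for illustration; the paper's actual pipeline presumably routes these normalized coordinates into parametric design tools rather than to the console.

```python
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands

# Any commodity webcam works; no specialized equipment is needed.
cap = cv2.VideoCapture(0)

with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand in results.multi_hand_landmarks:
                # Each detected hand yields 21 normalized (x, y, z) landmarks.
                tip = hand.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
                print(f"index fingertip: {tip.x:.3f}, {tip.y:.3f}, {tip.z:.3f}")
        if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
            break

cap.release()
```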
