Mobile handheld devices, such as smartphones and tablets, have become some of the most prominent ubiquitous terminals in the information and communication technology landscape. Within the digital music domain, they have transformed the entire ecosystem, from production to distribution and consumption. Of particular interest here is the ever-expanding number of mobile music applications. Despite their growing popularity, their design, in terms of interaction, perception, and control, remains largely arbitrary; it is poorly addressed in the related literature and lacks a clear, systematized approach. In this context, our paper takes the first steps towards defining guidelines for optimal sonic interaction design practices in mobile music applications. Our design approach is informed by data on how users appropriate mobile handheld devices. We conducted an experiment to identify links between control gestures and musical parameters such as pitch, duration, and amplitude. A twofold action-reflection protocol and a tool-set for evaluating these links are also proposed. The experimental results show statistically significant trends in the control-gesture mappings for pitch and duration. Amplitude, on the other hand, elicited more diverse mappings, with no definitive trend emerging in this experiment.