Computer-based interactive music generation systems hold an advantage over acoustic instruments in the flexibility of the human interface used to control them. Whereas human performers must conform to the relatively invariant constraints of acoustic instruments, a computer interface can be rapidly updated to reflect changes in user preferences and performance requirements. This paper describes a software environment that runs miniature psychophysical scaling experiments with a single user of an interactive music generation system and then inverts the resulting scaling functions to derive individualized control over music synthesis. A case study is presented in which the system controlled parametric synthesis of musical timbres from an electronic musical keyboard, automatically producing a perceptual mapping of synthesis parameters within their musically useful range. Further exploration of the timbral similarities among a user-selected set of synthesis patches yielded low-dimensional control structures that can be used to organize musical timbres in preparation for composition or performance.
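As a concrete illustration of the inversion step, the minimal sketch below assumes a Stevens power-law model fitted to magnitude-estimation data; the model choice, the function names (fit_power_law, invert_power_law), and all numeric values are illustrative assumptions, not details taken from the study. It fits psi = k * phi^a by linear regression in log-log space and then inverts the fitted function so that equally spaced perceptual targets map back to synthesis-parameter values.

```python
# A minimal sketch of psychophysical scaling inversion, assuming a
# Stevens power-law model (psi = k * phi**a). Data and names are
# hypothetical, for illustration only.
import numpy as np

def fit_power_law(phi, psi):
    """Fit psi = k * phi**a via linear regression in log-log space.
    Returns (k, a)."""
    a, log_k = np.polyfit(np.log(phi), np.log(psi), 1)
    return np.exp(log_k), a

def invert_power_law(psi, k, a):
    """Map a desired perceptual magnitude back to the physical
    synthesis parameter: phi = (psi / k)**(1 / a)."""
    return (psi / k) ** (1.0 / a)

if __name__ == "__main__":
    # Hypothetical scaling data: synthesis-parameter values presented
    # to the user (phi) and the user's magnitude estimates of the
    # perceptual attribute (psi).
    phi = np.array([0.1, 0.2, 0.4, 0.8, 1.6])
    psi = np.array([1.0, 1.9, 3.4, 6.2, 11.5])

    k, a = fit_power_law(phi, psi)

    # Divide the perceptual range into equal steps (e.g., for keys on
    # a MIDI controller) and invert each step to a parameter value,
    # yielding a perceptually uniform control mapping.
    targets = np.linspace(psi.min(), psi.max(), 8)
    params = invert_power_law(targets, k, a)
    print(np.round(params, 3))
```

Spacing the targets evenly in the perceptual domain rather than the parameter domain is what makes the resulting controller mapping feel uniform to the user, which is the practical point of inverting the scaling function.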