To better understand the behavioral and communication capabilities of Megaptera novaeangliae, the findings of a recent whale song study suggest an intriguing experiment to assess humpback whale responses to acoustically selected visual-feedback cues. Analysis of high-complexity, frequency-modulated song units indicates a Shannon-Hartley-compliant sub-unit architecture similar to human vowel generation. Like constant-pitch English vowels, which are differentiated by their two most energetic peak resonance (formant) frequencies, humpbacks exhibit precise vocal control over the production of sub-units with distinct, differentiable harmonic frequency combinations. Humans navigate informational, gaming, and adaptive-learning apps on mobile phones and tablet PCs using visual feedback from tactilely selected touchscreen icons and hyperlinks. An alternative to tactile touchscreen control is vocal selection of icons and links, keyed to the generation of specific vowel resonance frequencies and sub-unit harmonic frequencies. A software prototype of a voice-controlled “touchscreen” gaming experiment demonstrates how humans and humpback whales could conceivably interact, or how humpbacks could engage in informational transactions. The prototype also incorporates video “training” examples designed to guide subjects in the voiced selection of visual symbols assigned to sub-regions of a big-screen display.
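The selection scheme described above — identifying a vowel from its two most energetic resonance (formant) frequencies and mapping it to an icon in a sub-region of the display — can be sketched as follows. This is a minimal illustrative sketch, not the prototype's actual implementation: the reference formant values are approximate textbook averages for English vowels, and the vowel-to-icon layout is a hypothetical assumption.

```python
# Illustrative sketch: classify a voiced sound by its two strongest
# resonance (formant) frequencies and map it to a screen sub-region.
# Reference (F1, F2) pairs in Hz are approximate averages for English
# vowels; both tables below are assumptions, not values from the study.
VOWEL_FORMANTS = {
    "i": (270, 2290),   # as in "beet"
    "ae": (660, 1720),  # as in "bat"
    "a": (730, 1090),   # as in "father"
    "u": (300, 870),    # as in "boot"
}

# Hypothetical assignment of vowels to icons in the four display quadrants.
ICON_FOR_VOWEL = {
    "i": "top-left",
    "ae": "top-right",
    "a": "bottom-left",
    "u": "bottom-right",
}

def classify_vowel(f1: float, f2: float) -> str:
    """Return the vowel whose reference (F1, F2) pair is nearest (Euclidean)."""
    return min(
        VOWEL_FORMANTS,
        key=lambda v: (VOWEL_FORMANTS[v][0] - f1) ** 2
                      + (VOWEL_FORMANTS[v][1] - f2) ** 2,
    )

def select_icon(f1: float, f2: float) -> str:
    """Map measured formant frequencies to the icon in the matching sub-region."""
    return ICON_FOR_VOWEL[classify_vowel(f1, f2)]

if __name__ == "__main__":
    # Measured formants near the reference values for "i" select the
    # top-left icon.
    print(select_icon(280, 2250))  # → top-left
```

In a working system the (f1, f2) inputs would come from real-time formant tracking of the subject's vocalization; here they are passed in directly to keep the mapping logic visible.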