Abstract

In mixed-modality psychophysical scaling, stimuli from different modalities are presented alternately for judgment on the same scale. The usual purpose is to produce cross-modality matching functions without actually performing cross-modality matches. This paper reports the results of two experiments that extend the method to situations in which the responses, themselves cross-modality matches on an easy-to-control continuum (duration), are used to derive matching functions for two difficult-to-control continua (here, loudness and brightness). The derived cross-modality matching functions are highly similar to those obtained from magnitude estimation or category judgment responses. First- and second-order sequential dependencies also closely resemble those found in data from methods that employ numerical response scales, with one exception: for the first time in these studies of mixed-modality scaling, current responses were sometimes found to contrast weakly with the values of previous stimuli from a modality different from that of the current stimulus. The various sequential dependencies found may arise from different levels of processing, with intramodal response-stimulus contrast arising from sensory differentiation, inter- and intramodality response-response assimilation from perceptual categorization processes, and intermodality response-stimulus contrast from cognitive expectancies.
