In the measurement of the spectral luminous efficiency function $$V\left( \lambda \right)$$, quasi-monochromatic visual stimuli of higher luminance can be obtained with narrow-band optical filters and broadband light sources than with monochromators. However, the finite bandwidth of the optical filters inevitably distorts the measured $$V\left( \lambda \right)$$. In this paper, the effect of optical filters and light source spectra on the measurement of $$V\left( \lambda \right)$$ is studied. Filters with different spectral transmittances and light sources with different spectral power distributions are incorporated into a model that simulates the visual sensitivity experiment. By comparison with the true $$V\left( \lambda \right)$$, the distortion of the simulated $$V\left( \lambda \right)$$ caused by the filter bandwidth is quantitatively evaluated using a quality indicator, the mismatch error $$f'$$. The results and conclusions of this paper can serve as a guide for optimizing the design of visual sensitivity experiments for $$V\left( \lambda \right)$$ measurement.
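The abstract does not state how the mismatch error $$f'$$ is defined; a common choice for such a quality indicator is the CIE $$f_1'$$-style index, i.e. the area-normalized integrated absolute deviation between the (renormalized) simulated sensitivity and $$V\left( \lambda \right)$$. The following is a minimal sketch under that assumption, with a synthetic Gaussian stand-in for the tabulated $$V\left( \lambda \right)$$ curve and a broadened Gaussian standing in for a bandwidth-distorted measurement (all curves and parameters here are illustrative, not from the paper):

```python
import numpy as np

# Wavelength grid (nm) over the visible range.
wl = np.arange(380.0, 781.0, 1.0)

# Synthetic stand-in for CIE V(lambda): a Gaussian peaked at 555 nm.
# (The real tabulated V(lambda) is close to, but not exactly, Gaussian.)
V = np.exp(-0.5 * ((wl - 555.0) / 45.0) ** 2)

def mismatch_error(s_rel, V, wl):
    """Mismatch error in the style of the CIE f1' index (an assumed
    definition; the paper's exact f' may differ).

    The candidate sensitivity s_rel is first rescaled so that it and
    V(lambda) enclose the same area; the absolute deviation is then
    integrated and divided by the integral of V(lambda)."""
    s_star = s_rel * np.trapz(V, wl) / np.trapz(s_rel, wl)
    return np.trapz(np.abs(s_star - V), wl) / np.trapz(V, wl)

# Example: a "simulated" sensitivity broadened by filter bandwidth,
# modeled here simply as a wider Gaussian.
s_meas = np.exp(-0.5 * ((wl - 555.0) / 60.0) ** 2)
err = mismatch_error(s_meas, V, wl)
```

A perfectly matched sensitivity gives an error of zero, and the indicator grows as the filter bandwidth broadens the simulated curve away from $$V\left( \lambda \right)$$, which is how a single scalar can rank candidate filter and source combinations.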