The visual system of the fly performs various computations on photoreceptor outputs. The detection and measurement of movement is based on simple nonlinear, multiplication-like interactions between adjacent pairs and groups of photoreceptors. The position of a small contrasted object against a uniform background is measured, at least in part, by (formally) one-input nonlinear flicker detectors. A fly can also detect and discriminate a figure that moves relative to a ground texture. This computation of relative movement relies on a more complex algorithm, one which detects discontinuities in the movement field. The experiments described in this paper indicate that the outputs of neighbouring movement detectors interact in a multiplication-like fashion and in turn locally inhibit the flicker detectors. The following main characteristic properties (partly a direct consequence of the algorithm's structure) have been established experimentally:

a) Coherent motion of figure and ground inhibits the position detectors, whereas incoherent motion fails to produce inhibition near the edges of the moving figure (provided the textures of figure and ground are similar).

b) The movement detectors underlying this particular computation are direction-insensitive at input frequencies (at the photoreceptor level) above 2.3 Hz; they become increasingly direction-sensitive at lower input frequencies.

c) At higher input frequencies the fly cannot discriminate an object against a texture oscillating at the same frequency and amplitude with a phase shift of 0° or 180°, whereas a phase shift of 90° or 270° between the figure and ground oscillations yields maximum discrimination.

d) Under conditions of coherent movement, strong spatial incoherence is detected by the same mechanism.

The algorithm underlying the relative-movement computation is further discussed as an example of a coherence-measuring process operating on the outputs of an array of movement detectors. Possible neural correlates are also mentioned.
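The abstract only outlines the processing chain (correlation-type movement detectors whose neighbouring outputs are multiplied and then locally inhibit the flicker/position detectors) and gives no implementation details. The following Python sketch is therefore a schematic reading of that chain, not the authors' model: the filter constants, the direction-insensitive form of the elementary detectors, the temporal pooling stage, the divisive form of the local inhibition, and all function names are assumptions introduced for illustration.

```python
import numpy as np


def low_pass(x, dt, tau):
    """First-order low-pass filter along the time axis (axis 0)."""
    alpha = dt / (tau + dt)
    y = np.zeros_like(x)
    y[0] = x[0]
    for t in range(1, x.shape[0]):
        y[t] = y[t - 1] + alpha * (x[t] - y[t - 1])
    return y


def movement_detectors(receptors, dt, tau_delay=0.05):
    """Correlation-type elementary movement detectors between adjacent
    photoreceptors: each half-detector multiplies the low-pass-filtered
    ("delayed") signal of one receptor with the undelayed signal of its
    neighbour.  Summing the two mirror-symmetric half-detectors gives a
    direction-insensitive output, the regime the abstract describes for
    higher input frequencies (the filter constant is an assumption).
    Input shape (T, N); output shape (T, N - 1)."""
    delayed = low_pass(receptors, dt, tau_delay)
    return delayed[:, :-1] * receptors[:, 1:] + receptors[:, :-1] * delayed[:, 1:]


def flicker_detectors(receptors, dt):
    """One-input nonlinear flicker (position) detectors, modelled here simply
    as the squared temporal derivative of each photoreceptor signal."""
    return np.gradient(receptors, dt, axis=0) ** 2


def relative_movement_response(receptors, dt=0.001, tau_pool=0.2, gain=10.0):
    """Sketch of the figure-ground scheme described in the abstract: outputs
    of neighbouring movement detectors interact multiplicatively, the product
    is pooled in time, and the pooled signal locally inhibits the flicker
    detectors.  Coherent motion of figure and ground drives neighbouring
    detectors synchronously, keeping the product large and suppressing the
    position signal; at the edge of a figure moving incoherently with the
    ground the product drops and the flicker response survives."""
    receptors = receptors - receptors.mean(axis=0)        # remove DC (crude band-pass assumption)
    emd = movement_detectors(receptors, dt)               # (T, N - 1)
    product = emd[:, :-1] * emd[:, 1:]                    # neighbouring detectors multiplied
    pooled = low_pass(np.maximum(product, 0.0), dt, tau_pool)
    flicker = flicker_detectors(receptors, dt)[:, 1:-1]   # align with the (T, N - 2) product grid
    return flicker / (1.0 + gain * pooled)                # local divisive inhibition (assumed form)
```

Divisive inhibition and temporal pooling are chosen here only so that synchronously driven neighbouring detectors (coherent figure and ground motion) suppress the position signal while a phase-shifted figure leaves an uninhibited flicker response near its edges, qualitatively mirroring properties (a) and (c); any other inhibitory nonlinearity with the same effect would serve equally well in this sketch.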