Flies can visually track small moving targets, even against cluttered backgrounds. Previous computational models, based on the figure detection (FD) cells identified in the fly, have suggested how this may be accomplished at the neuronal level using information about relative motion between the target and the background. We experimented with this "small-field system model" for the tracking of small moving targets by a simulated fly in a cluttered environment and discovered several functional limitations. Based on these experiments, we propose elaborations of the original small-field system model that support stronger effects of background motion on small-field responses, proper accounting for more complex optical flow fields, and more direct guidance toward the target. We show that the elaborated model achieves much better tracking performance than the original in complex visual environments, and we discuss the biological implications of our elaborations. The elaborated model may also help explain recent electrophysiological data on FD cells that appear to contradict the original model.