Flying insects rely mainly on visual motion to detect and track objects. Although fly-inspired algorithms for object detection have been studied extensively, few have been developed from visual motion alone. One daunting difficulty is that the neural and circuit mechanisms underlying foreground-background segmentation remain unclear. Our previous modeling study proposed that the lobula holds parallel pathways with distinct directional selectivity, each of which can retinotopically discriminate figures moving in its own preferred direction based on relative motion cues. That model, however, did not address how the multiple parallel pathways converge onto a single detection output at their common downstream stage. Because the preferred directions of the pathways along either the horizontal or the vertical axis are opposite to each other, a background moving opposite to an object also activates the corresponding lobula pathway. Indiscriminate, ungated projection from all the pathways to their downstream stage would therefore mix objects with the moving background, causing the previous model to fail under a non-stationary background. Here, we extend the previous model by proposing that background-motion-dependent gating of individual lobula projections is the key to object detection. Large-field lobula plate tangential cells are hypothesized to perform this gating, realizing a bio-inspired form of background subtraction. The model is shown to robustly detect moving objects in video sequences captured by either a moving camera, which induces translational optic flow, or a static camera. The model sheds light on the potential of this concise fly-inspired algorithm for real-world applications.
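The sketch below illustrates, in schematic form, the gating idea described in the abstract: directionally selective pathway outputs are fused only after the pathway aligned with the estimated background motion is suppressed. It is a minimal illustration under stated assumptions, not the paper's implementation; the four pathway names, the wide-field mean-flow estimate of background motion, and the `gate_threshold` parameter are all hypothetical choices made here for clarity.

```python
# Illustrative sketch (not the paper's code) of background-motion-dependent
# gating of directionally selective pathway outputs, followed by fusion.
import numpy as np

# Hypothetical preferred directions of four lobula-like pathways as unit
# vectors (x, y), with y pointing down as in image coordinates.
PREFERRED_DIRECTIONS = {
    "rightward": np.array([1.0, 0.0]),
    "leftward":  np.array([-1.0, 0.0]),
    "upward":    np.array([0.0, -1.0]),
    "downward":  np.array([0.0, 1.0]),
}


def estimate_background_motion(flow: np.ndarray) -> np.ndarray:
    """Wide-field pooling of optic flow (a stand-in for large-field
    tangential-cell summation): the mean flow vector approximates the
    global background motion."""
    return flow.reshape(-1, 2).mean(axis=0)


def gate_and_fuse(figure_maps: dict, background_motion: np.ndarray,
                  gate_threshold: float = 0.5) -> np.ndarray:
    """Suppress the pathway whose preferred direction aligns with the
    background motion, then sum the remaining retinotopic figure maps."""
    speed = np.linalg.norm(background_motion)
    fused = None
    for name, pref in PREFERRED_DIRECTIONS.items():
        response = figure_maps[name]
        if speed > 1e-6:
            alignment = float(pref @ (background_motion / speed))
            # Gate off a pathway that is driven mainly by background motion.
            if alignment > gate_threshold:
                response = np.zeros_like(response)
        fused = response if fused is None else fused + response
    return fused


if __name__ == "__main__":
    h, w = 8, 8
    rng = np.random.default_rng(0)

    # Hypothetical retinotopic figure maps from the four pathways.
    figure_maps = {name: rng.random((h, w)) * 0.1 for name in PREFERRED_DIRECTIONS}
    figure_maps["leftward"][3, 4] = 1.0   # small object moving leftward

    # Background translating rightward (e.g., induced by camera motion).
    flow = np.tile(np.array([1.0, 0.0]), (h, w, 1))

    background = estimate_background_motion(flow)
    detection = gate_and_fuse(figure_maps, background)
    print("detected object at", np.unravel_index(detection.argmax(), detection.shape))
```

In this toy setting, the rightward pathway is silenced because it is the one activated by the translating background, so the leftward-moving object remains the dominant peak in the fused detection map.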