Abstract

This paper describes an auditory robotic system capable of computing the angle of incidence (azimuth) of a sound source on the horizontal plane. Using an Elman-type recurrent neural network (RNN), the system can dynamically track the sound source as its azimuth changes within the environment. The RNN is used to give the overall system fast tracking responses over a set time, rather than having the robot wait for the next sound position before moving. The system is first tested in a simulated environment, and these results are then compared with tests on the physical robotic system. The results show that a hybrid system incorporating cross-correlation and recurrent neural networks is an effective mechanism for controlling a robot that tracks sound sources azimuthally.
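The abstract does not give implementation details of the cross-correlation stage, so the following Python sketch is only a rough illustration of how azimuth is commonly estimated from the time difference of arrival (TDOA) between two microphones. The sample rate, microphone spacing, and function names are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, approximate value at room temperature (assumed)


def estimate_azimuth(left, right, fs, mic_distance):
    """Estimate source azimuth (radians) from two microphone signals
    via cross-correlation, assuming a far-field source."""
    # Cross-correlate the channels; the lag of the peak gives the
    # time difference of arrival (TDOA) in samples.
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)
    tau = lag / fs  # TDOA in seconds

    # Far-field geometry: path difference = mic_distance * sin(azimuth).
    sin_az = np.clip(SPEED_OF_SOUND * tau / mic_distance, -1.0, 1.0)
    return np.arcsin(sin_az)


if __name__ == "__main__":
    fs = 16000            # sample rate in Hz (assumed)
    mic_distance = 0.2    # metres between microphones (assumed)
    t = np.arange(0, 0.1, 1 / fs)
    tone = np.sin(2 * np.pi * 440 * t)

    # Simulate a source at ~30 degrees by delaying one channel.
    true_az = np.deg2rad(30)
    delay = int(round(mic_distance * np.sin(true_az) / SPEED_OF_SOUND * fs))
    left = np.pad(tone, (delay, 0))[: len(tone)]
    right = tone

    print(np.rad2deg(estimate_azimuth(left, right, fs, mic_distance)))
```

In the paper's hybrid design, an estimate of this kind would feed the Elman RNN, which provides the fast tracking response between successive azimuth measurements; the RNN itself is not sketched here, as its architecture and training details are not given in the abstract.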
