Classifying foliage targets by echolocation is important for landmark recognition both by bats, which use ultrasonic emissions, and by blind human echolocators (BEs), who use palatal clicks. Previous attempts to classify foliage used ultrasonic frequencies and single-sensor (monaural) detection. Motivated by the echolocation capabilities of BEs, a biomimetic sonar emitting audible clicks acquired 5600 binaural echoes over five sequential emissions that probed two foliage targets at aspect angles separated by 18°. Echo spectrograms formed the feature-vector inputs to artificial neural networks (ANNs) for classifying two targets, Ficus benjamina and Schefflera arboricola, whose leaf areas differ by a factor of four. The classification performance of ANNs without and with hidden layers was analyzed using tenfold cross-validation. Performance improved with input feature size, and classification with binaural echoes outperformed that with monaural echoes for the same number of emissions and for the same number of echoes. Linear classification accuracy was comparable to nonlinear classification, with both achieving fewer than 1% errors when binaural spectrogram features from five sequential emissions were used. This result is better by a factor of 20 than previous classification of these targets using only the time envelopes of the same echoes.
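The following is a minimal sketch, not the authors' code, of the evaluation protocol the abstract describes: flattened echo-spectrogram feature vectors are scored with a linear classifier (no hidden layer) and with a small ANN containing one hidden layer, both under tenfold cross-validation. The placeholder data, feature length, hidden-layer size, and the use of scikit-learn are illustrative assumptions.

```python
# Sketch of the classification protocol described in the abstract (assumed details noted below).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Placeholder data standing in for the 5600 binaural echoes: each row is a
# flattened spectrogram feature vector; labels are the two plants
# (0 = Ficus benjamina, 1 = Schefflera arboricola). Feature length is an assumption.
n_echoes, n_features = 5600, 256
X = rng.standard_normal((n_echoes, n_features))
y = rng.integers(0, 2, size=n_echoes)

# Linear classification (no hidden layer) vs. nonlinear classification (one hidden layer).
linear_clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
mlp_clf = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(32,), max_iter=300))

# Tenfold cross-validation, as in the abstract.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for name, clf in [("linear (no hidden layer)", linear_clf), ("one hidden layer", mlp_clf)]:
    acc = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
    print(f"{name}: mean accuracy = {acc.mean():.3f}, error rate = {1 - acc.mean():.3f}")
```

With real spectrogram features in place of the random placeholders, the printed error rates would correspond to the sub-1% errors reported for binaural features from five sequential emissions.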