Abstract

Classification of complex motor activities from brain imaging is relatively new in the fields of neuroscience and brain-computer interfaces (BCIs). We report sign language classification results for a set of three contrasting pairs of signs. Accuracy was 93.3% for executed signs and 76.7% for imagined signs. For the full multiclass problem, we used a decision directed acyclic graph of pairwise support vector machines, which achieved 63.3% accuracy for executed signs and 31.4% for imagined signs. Pairwise comparison of phrases composed of these signs yielded a mean accuracy of 73.4%. These results suggest the possibility of BCIs based on sign language.

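The multiclass step combines pairwise support vector machines in a decision directed acyclic graph (DDAG): each node applies one binary SVM and eliminates the losing class until a single sign class remains. The following is a minimal sketch of that scheme, not the authors' pipeline; the use of scikit-learn, the linear kernel, the number of classes, and the random stand-in features are all assumptions for illustration.

```python
from itertools import combinations

import numpy as np
from sklearn.svm import SVC


def train_pairwise_svms(X, y):
    """Fit one binary SVM for every pair of sign classes (illustrative settings)."""
    classifiers = {}
    for a, b in combinations(np.unique(y), 2):
        mask = np.isin(y, [a, b])
        classifiers[(a, b)] = SVC(kernel="linear").fit(X[mask], y[mask])
    return classifiers


def ddag_predict(x, classes, classifiers):
    """Walk the DDAG: each node's pairwise SVM eliminates one candidate class."""
    candidates = sorted(classes)
    while len(candidates) > 1:
        a, b = candidates[0], candidates[-1]
        winner = classifiers[(a, b)].predict(x.reshape(1, -1))[0]
        # Drop whichever of the two classes the pairwise SVM voted against.
        candidates.remove(b if winner == a else a)
    return candidates[0]


# Toy usage: random features stand in for the paper's brain-imaging features.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 10))      # 120 trials, 10 features each (assumed)
y = rng.integers(0, 6, size=120)    # 6 sign classes, i.e. three contrasting pairs
svms = train_pairwise_svms(X, y)
print(ddag_predict(X[0], np.unique(y), svms))
```

With K classes, the DDAG evaluates only K-1 of the K(K-1)/2 pairwise SVMs per prediction, since each comparison discards one candidate class.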