Abstract

Data-driven knowledge discovery is becoming a new trend across scientific fields. In light of this, the goal of the present paper is to introduce a novel framework for studying one interesting topic in cognitive and behavioral studies: multimodal communication in human-human and human-robot interaction. We present an end-to-end solution spanning data capture, data coding and validation, and data analysis and visualization. For data collection, we have developed a multimodal sensing system that gathers fine-grained video, audio, and human body movement data. For data analysis, we propose a hybrid solution based on visual data mining and information-theoretic measures. We suggest that this data-driven paradigm will not only lead to breakthroughs in understanding multimodal communication, but will also serve as a case study demonstrating the promise of data-intensive discovery applicable to a range of research topics in cognitive and behavioral studies.

Keywords: Scientific Discovery, Cognitive and Behavioral Studies, Human-Human Interaction, Human-Robot Interaction, Information Visualization, Data Mining
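The abstract does not spell out how the information-theoretic measures are computed. As a rough illustration of the kind of analysis described, the sketch below estimates mutual information between two synchronized behavioral streams. The stream names (gaze, head), the histogram binning, and the simulated data are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (assumed, not the paper's code): mutual information between
# two discretized behavioral streams sampled at the same rate, e.g. a
# speaker's gaze direction and a listener's head movement.
import numpy as np

def mutual_information(x, y, bins=8):
    """Estimate MI (in bits) between two 1-D signals via a joint histogram."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()            # joint distribution P(X, Y)
    px = pxy.sum(axis=1, keepdims=True)  # marginal P(X)
    py = pxy.sum(axis=0, keepdims=True)  # marginal P(Y)
    nz = pxy > 0                         # avoid log(0) terms
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Hypothetical usage with two partially coupled synthetic streams.
rng = np.random.default_rng(0)
gaze = rng.normal(size=1000)
head = gaze + 0.5 * rng.normal(size=1000)
print(f"MI(gaze, head) = {mutual_information(gaze, head):.3f} bits")
```

In practice the inputs would be time-aligned sensor streams (gaze, audio features, body motion) rather than simulated signals, and the binning scheme would depend on how the data were coded.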
