Abstract

We have developed a wearable device that records the activities of human–human and human–artifact interactions. Using microphones and cameras, the device imitates human perception, recording personal and social everyday-life experiences in multiple modalities, such as voice and visible scenes. These sensors record the perceived experiences continuously and detect and index interactions from nonverbal behavior. The indexed stored experiences can serve as the first step toward a multimodal knowledge base created from daily life. An infrared LED ID tag system detects interactions in terms of the IDs and relative positions of objects within the camera's visual field. In this study, we propose an "interaction scope," defined as the range of relative human–object positions that have a high probability of occurring in conversational interactions. Analysis of experimental conversational sessions confirms that this interaction scope exists and can represent these interactions naturally. We also demonstrate that our tag system effectively detects and measures the proposed interaction scope.
