Abstract

Touch plays a crucial role in humans’ nonverbal social and affective communication. It is therefore no surprise that considerable effort has been devoted to devising methodologies for automated touch classification: such an ability enables the use of smart touch sensors in real-life application domains such as socially-assistive robots and embodied telecommunication. The touch classification literature has made undeniable progress. However, its results are limited in two important ways. First, they are mostly based on the overall (i.e., average) accuracy of different classifiers and therefore offer little insight into how these approaches perform on different types of touch. Second, they do not consider the same type of touch at different levels of strength (e.g., gentle versus strong touch). This factor deserves investigation, since the intensity of a touch can utterly transform its meaning (e.g., from an affectionate gesture to a sign of punishment). The current study provides a preliminary investigation of these shortcomings by considering the accuracy of a number of classifiers for both within-touch (i.e., the same type of touch at differing strengths) and between-touch (i.e., different types of touch) classification. Our results help verify the strengths and shortcomings of different machine learning algorithms for touch classification. They also highlight some of the challenges whose solutions can pave the way for the integration of touch sensors in application domains such as human–robot interaction (HRI).
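
To make the two evaluation settings concrete, here is a minimal sketch of a between-touch task (predicting the type of touch) and a within-touch task (predicting gentle vs. strong for one touch type), each evaluated with off-the-shelf classifiers. The feature matrix, the number of touch types, and the choice of random forest and SVM are placeholders for illustration, not the study's actual pipeline (scikit-learn is assumed to be available).

    # Hypothetical setup: X holds per-trial touch features, `gesture` the touch-type
    # labels, and `strength` the gentle (0) vs. strong (1) labels for the same trials.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 16))           # placeholder features (120 trials, 16 features)
    gesture = rng.integers(0, 3, size=120)   # placeholder touch-type labels (3 types)
    strength = rng.integers(0, 2, size=120)  # placeholder strength labels

    classifiers = {"RF": RandomForestClassifier(random_state=0), "SVM": SVC()}
    for name, clf in classifiers.items():
        # Between-touch: distinguish different types of touch.
        between = cross_val_score(clf, X, gesture, cv=5).mean()
        # Within-touch: distinguish gentle vs. strong within a single touch type.
        mask = gesture == 0
        within = cross_val_score(clf, X[mask], strength[mask], cv=5).mean()
        print(f"{name}: between-touch={between:.2f}, within-touch={within:.2f}")

Reporting the two settings separately, and per touch type rather than only on average, is what lets such an analysis expose classifier weaknesses that a single overall accuracy figure hides.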

Highlights

  • Touch is one of the most basic and yet highly effective means of nonverbal communication for the transfer of affective feelings and emotions [1]

  • The present study sought to examine the utility of machine learning approaches for the classification of gentle vs. strong touch gestures

  • We asked our participants to perform these touch gestures at two different strengths: a gentle and a strong touch. This resulted in a total of six different scenarios

Introduction

Touch is one of the most basic and yet highly effective means of nonverbal communication for the transfer of affective feelings and emotions [1]. Such an ability finds application in a broad range of real-life domains, from socially-assistive robots [6] and robot therapy [7,8] to embodied telecommunication [9]. In this respect, considerable effort has been placed on devising methodologies for automated touch classification. In our analysis, we computed a new feature: the largest connected component (LCC). The LCC is the number of activated sensor cells in the largest connected region of the sensor vest, i.e., the size of the connected component that comprises the largest number of active cells among all connected components.
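
As a rough illustration of how this feature can be computed, the sketch below treats one frame of vest readings as a 2D grid of cell pressures, thresholds it into active/inactive cells, labels the connected components of active cells, and returns the size of the largest one. The grid shape, the activation threshold, and the use of scipy.ndimage are assumptions made for illustration rather than the implementation used in the study.

    import numpy as np
    from scipy.ndimage import label

    def largest_connected_component(frame, threshold=0.0):
        """LCC of one sensor-vest frame: the number of active cells in the
        largest connected region (frame shape and threshold are hypothetical)."""
        active = frame > threshold             # binary map of activated cells
        labeled, n_components = label(active)  # 4-connected component labeling
        if n_components == 0:
            return 0
        sizes = np.bincount(labeled.ravel())[1:]  # component sizes, skipping background
        return int(sizes.max())

    # Example: two separate blobs of active cells; the larger one has 4 cells.
    frame = np.array([
        [1, 1, 0, 0, 0],
        [0, 1, 0, 0, 2],
        [0, 1, 0, 0, 3],
        [0, 0, 0, 0, 0],
    ])
    print(largest_connected_component(frame))  # -> 4

Because it reduces a whole frame to a single number, LCC can sit alongside other per-frame statistics in the feature vector given to a classifier.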
