Abstract

It is of paramount importance to track the cognitive activity or cognitive attention of service personnel in a Prognostics and Health Monitoring (PHM) training or operation environment. Electroencephalography (EEG) data is a good candidate for recognizing the user's cognitive activity. Analyzing EEG data in an unconstrained (natural) environment to understand cognitive state and classify human activity is challenging for multiple reasons, such as low signal-to-noise ratio, transient nature, lack of a baseline, and uncontrolled mixing of various tasks. This paper proposes a framework based on deep learning that monitors human activity by fusing multiple EEG sensors and also selects a smaller sensor suite for a lean data collection system. Real-time classification of human activity from spatially non-collocated multi-probe EEG is performed with deep learning techniques, without any significant data preprocessing or manual feature engineering. Two types of deep neural networks, the deep belief network (DBN) and the deep convolutional neural network (DCNN), are used at the core of the proposed framework and automatically learn the features needed from EEG for a given classification task. Validation on an extensive dataset, collected from several subjects while they performed multiple tasks (listening and watching) in a PHM service training session, is presented, and significant parallels are drawn with existing domain knowledge on EEG data analysis. Comparison with benchmark machine learning techniques shows that deep learning based tools are better at understanding EEG data for task classification. Sensor selection shows that a significantly smaller EEG sensor suite can perform at an accuracy comparable to the original sensor suite.
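To make the kind of model at the core of the framework concrete, the following is a minimal sketch of a 1D convolutional network that maps a multi-channel EEG window to a task label (e.g., listening vs. watching). The channel count, window length, and layer sizes here are illustrative assumptions, not the architecture reported in the paper.

```python
# Illustrative sketch only: the channel count (14), window length (256 samples)
# and layer sizes are assumptions, not the paper's reported DCNN architecture.
import torch
import torch.nn as nn

class EEGConvNet(nn.Module):
    """Small 1D convolutional network mapping a multi-channel EEG window to a task label."""
    def __init__(self, n_channels=14, n_samples=256, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),  # temporal filters over raw EEG
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(4),
        )
        self.classifier = nn.Linear(64 * (n_samples // 16), n_classes)

    def forward(self, x):  # x: (batch, n_channels, n_samples)
        z = self.features(x)
        return self.classifier(z.flatten(1))

# Example forward pass on a random batch of EEG windows.
model = EEGConvNet()
logits = model(torch.randn(8, 14, 256))  # -> shape (8, 2)
```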

Highlights

  • It is becoming a ubiquitous practice in industry for field technicians to use wearables while performing Prognostics and Health Monitoring (PHM) related service

  • This paper proposes an architecture based on deep learning that monitors human activity in real time by fusing multiple EEG sensors in an unconstrained environment and selects a smaller sensor suite for a lean data collection system

  • The EEG data collected from the first and third participants are used to construct the training set, and the data from the second participant are used for testing (a subject-wise split, sketched below)
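A minimal sketch of this subject-wise split, assuming per-participant windowed EEG arrays; the file names, array keys and shapes are hypothetical placeholders, not the paper's actual data pipeline.

```python
# Hypothetical per-participant EEG windows X: (n_windows, n_channels, n_samples)
# and task labels y, stored one .npz file per participant (placeholder names).
import numpy as np

data = {p: np.load(f"eeg_subject_{p}.npz") for p in (1, 2, 3)}

# Train on participants 1 and 3, hold out participant 2 for testing,
# so the classifier is evaluated on an unseen subject.
X_train = np.concatenate([data[1]["X"], data[3]["X"]])
y_train = np.concatenate([data[1]["y"], data[3]["y"]])
X_test, y_test = data[2]["X"], data[2]["y"]
```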

Introduction

It is becoming a ubiquitous practice in industry for field technicians to use wearables (with multi-modal sensor nodes) while performing PHM related service. Real-time tracking of service personnel's cognitive activity in a Prognostics and Health Monitoring (PHM) environment is significant both for designing effective multimedia training modules and for evaluating the quality of service at PHM-critical industries. EEG data is a preferable non-invasive candidate for tracking human activity. Real-time understanding of the workload, fatigue and alertness of field maintenance personnel facilitates the creation of an efficient and safe work environment. Most existing state-of-the-art techniques analyze EEG data collected in a constrained environment along with a known baseline activity response. For a real work environment, an activity monitoring process and a lean data collection system have to be developed that can process EEG data and recognize human activity in real time without baseline knowledge.
