Interest in health monitoring with body sensor data is growing rapidly because of its use in diverse applications. Human Activity Recognition (HAR) based on smart wearable devices is particularly valuable in healthcare, where it helps the elderly population maintain their lifestyle by anticipating adverse events such as health risks and falls. Although various types of sensors can be used to recognize human behavior, wearable sensors provide consistent information about the lifestyle and functional status of subjects. Accurate recognition, however, requires intelligent algorithms that extract temporal features from raw time series sensor readings. To recognize human activities more accurately from time series data, this paper proposes a crossover Aquila-optimized Tri-kernel Extreme Learning Machine (CSA-TkELM) classifier. The effectiveness of the proposed classifier is evaluated on sensor data from the USC Human Activity Dataset (USC-HAD), the Public domain dataset for Real-life human activity recognition using smartphone Sensors (PDS-SS), and the OPPORTUNITY dataset. Because applying raw time series data directly to the classifier degrades recognition performance, preprocessing steps such as noise removal, handling of missing values, standardization, and segmentation are performed first. The proposed CSA-TkELM classifier then extracts significant temporal features from the time series and recognizes the activities. Recall, precision, accuracy, and F1 score are measured to assess the results. The proposed CSA-TkELM classifier achieves a recall of 98.1%, precision of 97.6%, accuracy of 97.9%, and F1 score of 97.3%, which are higher than those of the compared existing techniques.
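As a rough illustration of the preprocessing stage mentioned above (standardization and segmentation of raw wearable-sensor time series before classification), a minimal NumPy sketch is given below. The window length, overlap, and function names are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

def standardize(signal: np.ndarray) -> np.ndarray:
    """Z-score each sensor channel of a (samples, channels) array (illustrative)."""
    mean = signal.mean(axis=0, keepdims=True)
    std = signal.std(axis=0, keepdims=True) + 1e-8  # avoid division by zero
    return (signal - mean) / std

def segment(signal: np.ndarray, window: int = 128, overlap: float = 0.5) -> np.ndarray:
    """Split a (samples, channels) series into overlapping fixed-length windows.

    The window length and overlap here are assumed values, not those used in the paper.
    """
    step = max(1, int(window * (1.0 - overlap)))
    windows = [signal[start:start + window]
               for start in range(0, len(signal) - window + 1, step)]
    return np.stack(windows)  # shape: (num_windows, window, channels)

# Example: 10 s of synthetic 6-channel inertial data sampled at 100 Hz
raw = np.random.randn(1000, 6)
segments = segment(standardize(raw))
print(segments.shape)  # (14, 128, 6)
```

Each resulting window would then be passed to a feature extractor and classifier such as the proposed CSA-TkELM; the segmentation parameters shown here are only placeholders.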