Abstract

Artificial Intelligence (AI), now a mainstream science, has the potential to significantly improve human well-being and wellness. This study develops an automated caretaking system that enables constant monitoring of people with minimal human intervention. To do so, a wide range of human movements and varied viewpoints must be accounted for in real-time contexts. The proposed system, coined "Eye-Tact", integrates a vision-based multimodal architecture with wearable sensors to identify poses and detect falls. For people with Parkinson's disease (PD), this patient-specific, vision-based keypoint analysis model has been successfully deployed for person identification and aberrant activity recognition. The proposed Multi Model Ensemble Technique (MMET) employs a variety of sensors to acquire physiological and other parameters necessary for fall prediction and evaluation. The system is evaluated using precision, recall, F1 score, and support, comparing the performance of several models, including XGBoostClassifier, CatBoostClassifier, and RandomForestClassifier. The results reveal that the RandomForestClassifier outperforms the other classifiers with 97% accuracy. The proposed work demonstrates the capacity to build a system that carefully understands and analyses heterogeneous data using state-of-the-art technologies.
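The per-class evaluation metrics named in the abstract (precision, recall, F1 score, support) can be sketched in plain Python. The labels and predictions below are illustrative placeholders for a fall-detection task, not data from the study:

```python
def per_class_metrics(y_true, y_pred):
    """Return {label: (precision, recall, f1, support)} for each class."""
    metrics = {}
    for label in sorted(set(y_true) | set(y_pred)):
        # Count true positives, false positives, and false negatives per class.
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == label)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != label and p == label)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == label and p != label)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        # Support is the number of true instances of this class.
        metrics[label] = (precision, recall, f1, tp + fn)
    return metrics

# Hypothetical ground truth and classifier output for two activity classes.
y_true = ["fall", "fall", "normal", "normal", "normal", "fall"]
y_pred = ["fall", "normal", "normal", "normal", "fall", "fall"]
print(per_class_metrics(y_true, y_pred))
```

In practice these values would come from a library report (e.g. scikit-learn's `classification_report`) applied to the predictions of each candidate classifier; the hand-rolled version above only makes the definitions explicit.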
