This work combines multiple independent user profiles that capture behavioral, emotional, medical, and physical patterns in the working and living environment into a single general user profile. Depending on the user's current activity (e.g., walking, eating), medical history, and other influential factors, the developed framework acts as a supplemental assistant, both enabling supportive functionalities (e.g., image filtering, magnification) and issuing informative recommendations (e.g., on diet or alcohol). Personalization of this profile is grounded in the user's past preferences, with human activity recognition as its basis, and is achieved through a statistical model, the Bayesian belief network. Training and real-time methodological pipelines are introduced and validated. The deep learning techniques employed for identifying human activities are presented and validated on publicly available and in-house datasets. The overall accuracy of human activity recognition reaches up to 86.96%.
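To illustrate the kind of inference the abstract describes, the following is a minimal sketch (not the authors' implementation) of a discrete Bayesian belief network that maps a recognized activity and a medical-history variable to a recommendation. All node names, states, and probabilities are illustrative assumptions.

```python
# Hypothetical discrete Bayesian belief network: two parent nodes
# (recognized activity, medical condition) feed one recommendation node.
# All probabilities below are made-up placeholders for illustration.

# Priors over the parent nodes.
P_activity = {"walking": 0.5, "eating": 0.5}
P_condition = {"diabetic": 0.3, "none": 0.7}

# Conditional probability table: P(recommendation | activity, condition).
CPT = {
    ("eating", "diabetic"):  {"limit_sugar": 0.8, "no_advice": 0.2},
    ("eating", "none"):      {"limit_sugar": 0.2, "no_advice": 0.8},
    ("walking", "diabetic"): {"limit_sugar": 0.3, "no_advice": 0.7},
    ("walking", "none"):     {"limit_sugar": 0.1, "no_advice": 0.9},
}

def posterior_recommendation(activity=None, condition=None):
    """P(recommendation | observed evidence), computed by enumerating
    over any unobserved parent nodes and normalizing."""
    scores = {}
    for a, pa in P_activity.items():
        if activity is not None and a != activity:
            continue
        for c, pc in P_condition.items():
            if condition is not None and c != condition:
                continue
            for rec, pr in CPT[(a, c)].items():
                scores[rec] = scores.get(rec, 0.0) + pa * pc * pr
    z = sum(scores.values())
    return {rec: p / z for rec, p in scores.items()}

# Observing that a user with a relevant medical history is eating
# shifts the posterior toward the dietary warning.
post = posterior_recommendation(activity="eating", condition="diabetic")
```

In a full system, the `activity` evidence would come from the deep learning activity-recognition module rather than being set by hand, and the CPTs would be learned from the user's past preferences.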