Predicting human behaviour is a complex task. Traditional methods often rely on explicit user input or external observation, which can be restrictive and impractical in real-world scenarios. As an alternative, Brain-Computer Interfaces (BCIs) offer a more direct means of accessing cognitive and emotional states, providing valuable insight into human intentions and decision-making processes. This paper proposes a novel method that predicts and suggests personalised emotion-based activities for individual users from multi-modal sensory data collected from the brain, body, and environment. Our method overcomes the limitations of conventional systems by collecting multi-modal data throughout the day to better understand user context and intent. By analysing these data, we predict the user's emotion-based activities for the day. We train the method using state-of-the-art, nature-inspired reinforcement learning algorithms and agent technology so that its recommendations are continuously optimised and personalised. In the performance evaluation, the proposed method achieved an accuracy of 95.6% and an F1 score of 84%, 2 to 3% higher accuracy than state-of-the-art AI-based emotion detection methods.
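The abstract does not include implementation details, but the core idea of an emotion-conditioned activity recommender trained with reinforcement learning can be sketched with a simple epsilon-greedy bandit update. Everything below is illustrative and hypothetical: the emotion labels, activity set, and `ActivityRecommender` class are assumptions, and the epsilon-greedy rule is a minimal stand-in for the nature-inspired RL algorithms the paper actually uses.

```python
import random

# Hypothetical emotion labels (e.g. inferred from BCI/body sensors) and
# candidate activities -- both are illustrative, not from the paper.
EMOTIONS = ["calm", "stressed", "happy", "sad"]
ACTIVITIES = ["walk", "meditate", "socialise", "rest"]

class ActivityRecommender:
    """Sketch of an emotion-conditioned recommender trained online."""

    def __init__(self, epsilon=0.1, lr=0.2):
        self.epsilon = epsilon  # exploration rate
        self.lr = lr            # learning rate for the value update
        # q[emotion][activity]: running estimate of user satisfaction
        self.q = {e: {a: 0.0 for a in ACTIVITIES} for e in EMOTIONS}

    def suggest(self, emotion):
        # Explore occasionally; otherwise exploit the best-known activity
        if random.random() < self.epsilon:
            return random.choice(ACTIVITIES)
        return max(self.q[emotion], key=self.q[emotion].get)

    def feedback(self, emotion, activity, reward):
        # Move the estimate towards the observed reward (user rating),
        # personalising suggestions over time
        self.q[emotion][activity] += self.lr * (reward - self.q[emotion][activity])

rec = ActivityRecommender()
rec.feedback("stressed", "meditate", 1.0)  # user liked the suggestion
print(rec.suggest("stressed"))
```

In practice the emotion input would come from the multi-modal sensing pipeline described in the abstract, and the reward signal from implicit or explicit user feedback; the tabular update here would be replaced by the paper's agent-based, nature-inspired optimisation.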