Abstract

Early detection of influenza-like symptoms can help prevent the widespread transmission of flu viruses and enable timely treatment, particularly in the post-pandemic era. Mobile sensing leverages an increasingly diverse set of embedded sensors to capture fine-grained information about human behaviors and ambient contexts, and can serve as a promising solution for influenza-like symptom recognition. Traditionally, handcrafted and high-level features of mobile sensing data are extracted by manual feature engineering and convolutional/recurrent neural networks, respectively. In this work, we apply graph representations to encode the dynamics of state transitions and internal dependencies in human behaviors, leverage graph embeddings to automatically extract topological and spatial features from graph inputs, and propose an end-to-end graph neural network (GNN) model with multi-channel mobile sensing input for influenza-like symptom recognition based on people's daily mobility, social interactions, and physical activities. Using data generated from 448 participants, we show that a GNN with GraphSAGE convolutional layers significantly outperforms baseline models with handcrafted features. Furthermore, we use a GNN interpretability method to generate insights (e.g., important nodes and graph structures) about the importance of mobile sensing for recognizing influenza-like symptoms. To the best of our knowledge, this is the first work that applies graph representations and graph neural networks to mobile sensing data for graph-based human behavior modeling and health symptom prediction.
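To illustrate the core building block the abstract names, the sketch below shows one GraphSAGE layer with mean aggregation followed by a mean-pool readout that yields a graph-level embedding suitable for a downstream symptom classifier. This is a minimal NumPy illustration of the general GraphSAGE operation (Hamilton et al.), not the authors' implementation; the toy graph, feature dimensions, and weight matrix are hypothetical.

```python
import numpy as np

def sage_layer(H, adj, W):
    """One GraphSAGE layer with mean aggregation:
    h_v' = ReLU(W @ concat(h_v, mean of neighbor features))."""
    agg = np.zeros_like(H)
    for v in range(H.shape[0]):
        nbrs = adj[v]
        # Fall back to the node's own features if it has no neighbors.
        agg[v] = H[nbrs].mean(axis=0) if nbrs else H[v]
    Z = np.concatenate([H, agg], axis=1) @ W.T
    return np.maximum(Z, 0.0)  # ReLU

# Hypothetical behavior-transition graph: 4 behavior states (nodes),
# edges encode observed transitions between states.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 5))   # 5-dim node features (e.g., dwell-time stats)
W = rng.normal(size=(8, 10))  # maps concat(5 + 5) dims -> 8 hidden dims

H1 = sage_layer(H, adj, W)        # updated node embeddings, shape (4, 8)
graph_emb = H1.mean(axis=0)       # mean-pool readout for graph-level prediction
```

In an end-to-end model such as the one described, `graph_emb` (one per sensing channel) would feed a classification head trained jointly with the GNN layers.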
