Abstract

Human Activity Recognition (HAR) using wearable and smartphone sensors has gained traction in recent years across diverse areas such as monitoring, entertainment, teaching, and healthcare. Such systems rely on activity recognition, which aims to identify the actions, characteristics, and goals of one or more individuals from a temporal sequence of observations streamed from one or more sensors. A variety of HAR models have been described in the literature. Deep learning methods have achieved strong HAR performance in recent years, but they typically demand substantial computational resources, which limits their efficient deployment in applications. This paper presents a lightweight, low-computational-cost deep learning model for HAR that is well suited to real-time applications. A generic HAR framework for smartphone sensor data is proposed, combining Long Short-Term Memory (LSTM) networks for modeling the time series with a standard Convolutional Neural Network (CNN) for classification. The results show that the proposed model outperforms many widely used deep learning and machine learning techniques.
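To make the described CNN-plus-LSTM framework concrete, the sketch below shows one plausible way to combine 1D convolutions with an LSTM for windowed smartphone sensor data in Keras. This is an illustrative assumption, not the authors' exact architecture; the window length, channel count, class count, and layer sizes follow the common UCI-HAR data layout and are placeholders.

```python
# Minimal sketch, assuming a UCI-HAR-style input layout:
# windows of 128 samples over 9 sensor channels, 6 activity classes.
# These values and the layer sizes are assumptions, not the paper's settings.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

WINDOW_LEN = 128   # samples per window (assumed)
N_CHANNELS = 9     # e.g. 3-axis body acc + gyro + total acc (assumed)
N_CLASSES = 6      # e.g. walking, sitting, standing, ... (assumed)

def build_cnn_lstm():
    model = models.Sequential([
        layers.Input(shape=(WINDOW_LEN, N_CHANNELS)),
        # 1D convolutions extract local motion features from each window
        layers.Conv1D(64, kernel_size=3, activation="relu"),
        layers.Conv1D(64, kernel_size=3, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        # the LSTM models the temporal ordering of the pooled feature sequence
        layers.LSTM(100),
        layers.Dropout(0.5),
        # softmax output over the activity classes
        layers.Dense(N_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_cnn_lstm()
    model.summary()
    # dummy tensors only to demonstrate the expected shapes
    x = np.random.randn(32, WINDOW_LEN, N_CHANNELS).astype("float32")
    y = np.random.randint(0, N_CLASSES, size=(32,))
    model.fit(x, y, epochs=1, batch_size=16, verbose=0)
```

In this kind of hybrid, the convolutional front end reduces the sequence length and per-step dimensionality before the recurrent layer, which is one common way to keep the model small enough for real-time use on constrained devices.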
