Abstract

IoT healthcare architectures are moving toward data-driven, patient-centric health models in which personalized assistance is provided by deploying advanced sensors. Following surgical procedures, patients are monitored in the emergency unit until they are physically stable and then shifted to the ward for further recovery and evaluation. Evaluation in the ward normally does not involve continuous monitoring of physiological parameters, so patient relapses are common. In real-time healthcare applications, vital parameters are estimated through dedicated sensors, which remain expensive and are highly sensitive to harsh environmental conditions. Furthermore, these sensors usually introduce delay into real-time monitoring. Because of these issues, data-driven soft sensors are highly attractive alternatives. Motivated by this, an Auto Encoder Deep Neural Network (AutoEncDeepNN) is proposed within an Internet-of-Health framework that assists patients through a trigger-based sensor activation model managing master and slave sensors. The advantage of the proposed method is that hidden information is mined automatically from the sensors and highly representative features are generated through iteration over multiple layers. The proposed model consistently outperforms standard approaches such as the Hierarchical Extreme Learning Machine (HELM), Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM). The proposed AutoEncDeepNN method achieves 94.72% accuracy, 41.96% RMSE, 34.16% RAE and 48.68% MAE in 74.64 ms.
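
To illustrate the idea of mining hidden information from sensor streams through multiple encoding layers, the sketch below shows a generic stacked autoencoder for vital-sign soft sensing. It is a minimal assumption-based example in PyTorch, not the authors' released implementation; the class name, layer sizes, and input dimensionality are hypothetical placeholders.

```python
# Minimal sketch (assumption, not the authors' code) of a stacked autoencoder
# used as a data-driven soft sensor: the encoder compresses a window of raw
# sensor readings into representative features, and the decoder reconstructs
# the input so the network learns the hidden structure of the signals.
import torch
import torch.nn as nn

class AutoEncDeepNN(nn.Module):  # hypothetical name mirroring the paper
    def __init__(self, n_inputs: int = 8, n_latent: int = 3):
        super().__init__()
        # Several encoding layers iteratively refine the representation.
        self.encoder = nn.Sequential(
            nn.Linear(n_inputs, 16), nn.ReLU(),
            nn.Linear(16, 8), nn.ReLU(),
            nn.Linear(8, n_latent),
        )
        # Mirror-image decoder, used only for reconstruction during training.
        self.decoder = nn.Sequential(
            nn.Linear(n_latent, 8), nn.ReLU(),
            nn.Linear(8, 16), nn.ReLU(),
            nn.Linear(16, n_inputs),
        )

    def forward(self, x):
        z = self.encoder(x)          # learned soft-sensor features
        return self.decoder(z), z

# One reconstruction-loss training step on a batch of normalized vitals
# (random data stands in for real sensor windows).
model = AutoEncDeepNN()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
batch = torch.randn(32, 8)
recon, features = model(batch)
loss = nn.functional.mse_loss(recon, batch)
loss.backward()
optim.step()
```

After training, the encoder output (`features`) can feed a downstream predictor for the vital parameter of interest, which is the usual way an autoencoder-based soft sensor is deployed.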
