Abstract

The Internet of Remote Things (IoRT) has emerged as a transformative paradigm that merges IoT capabilities with remote technologies. IoRT environments, featuring interconnected sensors and robots, face challenges such as sensor noise and low-light conditions that compromise video stream quality. This paper proposes a Hybrid Video Denoising and Blending Framework to address these shortcomings in IoRT video data. Leveraging denoising techniques in both the spatial and temporal domains, the framework effectively removes noise while preserving crucial details. Advanced blending algorithms enable the seamless fusion of data from multiple sources, enhancing decision-making in real-world scenarios. The framework adopts a dynamic weighted averaging approach and an optimal sensor selection mechanism to intelligently choose the most informative data sources, improving the quality of the blended output. Extensive experiments on a diverse IoRT dataset demonstrate the framework's superiority over state-of-the-art techniques, with significant improvements in video quality, noise reduction, and data fusion accuracy. Applications such as surveillance, autonomous remote systems, and industrial automation can benefit from the framework's ability to provide clearer, more reliable visual information. In conclusion, this research introduces a pioneering approach to mitigating video noise and enhancing data fusion in IoRT, showing promising results and paving the way for further research on the integration of remote technologies and IoT.
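
The abstract mentions a dynamic weighted averaging approach for fusing frames from multiple sensors. As a rough illustration of that general idea only (not the paper's actual algorithm, which is not detailed in the abstract), the sketch below blends frames from several simulated sensors using inverse-variance-style weights derived from a simple per-frame noise estimate. The function names, the Laplacian-based noise proxy, and the weighting rule are all assumptions made for illustration.

```python
import numpy as np


def estimate_noise(frame: np.ndarray) -> float:
    """Rough per-frame noise estimate: median absolute deviation of the
    discrete Laplacian. Illustrative proxy only; the paper does not
    specify its noise estimator."""
    lap = (
        -4 * frame[1:-1, 1:-1]
        + frame[:-2, 1:-1] + frame[2:, 1:-1]
        + frame[1:-1, :-2] + frame[1:-1, 2:]
    )
    return float(np.median(np.abs(lap - np.median(lap))) + 1e-6)


def blend_frames(frames: list) -> np.ndarray:
    """Dynamic weighted averaging: noisier sources receive smaller weights."""
    noise = np.array([estimate_noise(f) for f in frames])
    weights = 1.0 / noise**2            # inverse-variance style weighting
    weights /= weights.sum()
    stacked = np.stack(frames, axis=0).astype(np.float64)
    # Weighted sum over the sensor axis -> one fused frame
    return np.tensordot(weights, stacked, axes=1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Smooth synthetic scene observed by three sensors with different noise levels
    y, x = np.mgrid[0:64, 0:64] / 64.0
    clean = 0.5 + 0.4 * np.sin(2 * np.pi * x) * np.cos(2 * np.pi * y)
    frames = [clean + rng.normal(0, s, clean.shape) for s in (0.05, 0.15, 0.40)]

    fused = blend_frames(frames)
    print("per-sensor MSE:", [round(float(np.mean((f - clean) ** 2)), 4) for f in frames])
    print("fused MSE:     ", round(float(np.mean((fused - clean) ** 2)), 4))
```

Under these assumptions, the fused frame's error is close to (or below) that of the cleanest sensor, since the weighting automatically discounts the noisier sources; an optimal sensor selection step, as described in the abstract, could additionally drop sources whose estimated noise exceeds a threshold.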
