Indoor activity monitoring for elderly and vision-impaired people uses sensor technology to track interaction and movement within the living area. Such a system can identify deviations from regular patterns, raise alerts, and improve safety in emergencies or situations of potential risk. These solutions enhance quality of life by promoting independent living while offering peace of mind to loved ones and caregivers. Visual impairment has become an increasingly prominent concern in recent years, and visually impaired people usually rely on help from others to carry out their daily tasks. An automated human activity recognition (HAR) technique could support a vision-impaired person's day-to-day activities and safe movement. HAR using deep learning (DL) involves training neural networks to automatically detect and classify various human activities from sensor data. By leveraging DL approaches, a HAR system can accurately identify and classify activities such as sitting, standing, walking, and running from data captured by sensors or wearable devices. Such systems are integral to smart home automation, healthcare, and sports analytics. This study develops an Automated Fractal Quantum Salp Swarm Algorithm with Machine Learning based HAR (QSSA-MLHAR) for assisting elderly and visually impaired people. The QSSA-MLHAR technique uses sensory input data to identify human activities across diverse classes. In the QSSA-MLHAR technique, a min–max scaler is first used to normalize the input data. The technique then applies a mixed extreme learning machine (M-ELM) model to classify the various human actions, and a QSSA-based hyperparameter tuning process is used to improve the detection results of the M-ELM system. A wide range of simulations was performed to validate the performance of the QSSA-MLHAR method, and the comprehensive outcomes confirm that it performs favorably compared with other models.
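
The preprocessing step maps each sensor feature to a fixed range. The following is a minimal sketch of min–max scaling, assuming a NumPy feature matrix with samples in rows (scikit-learn's MinMaxScaler implements the same transform); the variable names and the synthetic data are illustrative only.

```python
import numpy as np

def min_max_scale(X, feature_range=(0.0, 1.0)):
    """Scale each column of X to feature_range, as in the preprocessing step."""
    lo, hi = feature_range
    X_min = X.min(axis=0)
    X_max = X.max(axis=0)
    span = np.where(X_max == X_min, 1.0, X_max - X_min)  # guard constant features
    return lo + (X - X_min) / span * (hi - lo)

# Illustrative input: 5 samples of 3 hypothetical sensor channels.
X = np.random.default_rng(0).normal(size=(5, 3))
print(min_max_scale(X))  # every column now lies in [0, 1]
```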
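The abstract does not detail the mixed (M-ELM) variant, so the sketch below shows only the basic extreme learning machine mechanism it builds on: a single hidden layer with random, untrained weights and a closed-form least-squares readout. The class name and hyperparameter defaults are assumptions for illustration.

```python
import numpy as np

class ELMClassifier:
    """Minimal extreme learning machine: random hidden layer, least-squares readout."""

    def __init__(self, n_hidden=200, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Fixed random projection followed by a nonlinearity.
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        # y: integer class labels 0..K-1.
        n_classes = int(y.max()) + 1
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        T = np.eye(n_classes)[y]           # one-hot targets
        H = self._hidden(X)
        self.beta = np.linalg.pinv(H) @ T  # closed-form output weights
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)
```

Because only the output weights are solved for, training reduces to one pseudoinverse, which is what makes ELM-style models cheap enough to retrain repeatedly inside a hyperparameter search.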
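The abstract states that QSSA tunes the M-ELM hyperparameters but does not specify its quantum or fractal modifications, so the sketch below implements the classical salp swarm update (leader moves around the best solution, followers average with their predecessor) as a stand-in. The function name, population settings, and the toy fitness are assumptions; in the intended use, the fitness would be a validation error of the classifier above.

```python
import numpy as np

def salp_swarm_minimize(fitness, lb, ub, n_salps=20, n_iter=50, seed=0):
    """Classical salp swarm optimization over the box [lb, ub]."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(lb, ub, size=(n_salps, lb.size))
    food, food_fit = None, np.inf
    for l in range(1, n_iter + 1):
        fits = np.array([fitness(p) for p in pos])
        if fits.min() < food_fit:              # track the best solution (food source)
            food_fit = fits.min()
            food = pos[fits.argmin()].copy()
        c1 = 2 * np.exp(-(4 * l / n_iter) ** 2)  # decaying exploration coefficient
        for i in range(n_salps):
            if i == 0:                         # leader moves around the food source
                c2, c3 = rng.random(lb.size), rng.random(lb.size)
                step = c1 * ((ub - lb) * c2 + lb)
                pos[i] = np.where(c3 < 0.5, food + step, food - step)
            else:                              # followers track the salp ahead
                pos[i] = (pos[i] + pos[i - 1]) / 2
        pos = np.clip(pos, lb, ub)
    return food, food_fit

# Toy demo: minimize a 1-D quadratic with optimum at 3; a real tuning run would
# instead score, e.g., ELMClassifier(n_hidden=int(p[0])) on held-out data.
best, err = salp_swarm_minimize(lambda p: float((p - 3.0) @ (p - 3.0)),
                                lb=np.array([-10.0]), ub=np.array([10.0]))
print(best, err)
```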