In recent years, several worldwide ambient assisted living (AAL) programs have been launched to improve the quality of life of older people and to strengthen the industrial base through the use of information and communication technologies. An important goal is to extend the time that older people can live in their home environment by increasing their autonomy and helping them carry out activities of daily living (ADLs). Research on the automatic detection of falls has received considerable attention, with the objective of enhancing the safety, emergency response, and independence of the elderly while containing the social and economic costs related to fall accidents. In this work, an algorithmic framework for detecting falls using 3D time-of-flight vision technology is presented. The proposed system has complementary operating requirements with respect to traditional wearable and non-wearable fall-detection devices. The vision system uses a state-of-the-art 3D range camera to measure elderly movements and to detect critical events such as falls. The depth images provided by the active sensor allow reliable segmentation and tracking of elderly movements using well-established imaging methods. Moreover, the range camera provides 3D metric information under all illumination conditions (including night vision), overcoming some typical limitations of passive vision (shadows, camouflage, occlusions, brightness fluctuations, perspective ambiguity). A self-calibration algorithm allows the range camera to be mounted in different setups by non-technical users. A large dataset of simulated fall events and ADLs in real dwellings was collected, and the proposed fall-detection system demonstrated high performance in terms of sensitivity and specificity.
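The abstract does not detail the detection algorithm itself, so the following is only a minimal, hypothetical sketch of the general idea it describes: segmenting a person in depth frames against a static background and flagging a fall when the tracked centroid remains close to the floor. The overhead camera geometry, all threshold values, and the class/function names are illustrative assumptions, not the paper's actual method.

```python
"""Minimal sketch of depth-based fall detection (illustrative assumptions only).

Assumes an overhead-mounted range camera, so a pixel's height above the floor
is approximately camera_height - depth. All thresholds are made-up values.
"""
import numpy as np

CAMERA_HEIGHT_M = 2.6     # assumed mounting height of the range camera
FOREGROUND_DIFF_M = 0.15  # assumed depth difference marking a moving person
FALL_HEIGHT_M = 0.40      # assumed centroid height suggesting a lying posture
FALL_FRAMES = 15          # assumed consecutive low-height frames to confirm a fall


def segment_foreground(depth, background):
    """Pixels significantly closer to the camera than the static background."""
    return (background - depth) > FOREGROUND_DIFF_M


def centroid_height(depth, mask):
    """Mean height above the floor of the segmented person, or None if empty."""
    if not mask.any():
        return None
    return float(np.mean(CAMERA_HEIGHT_M - depth[mask]))


class FallDetector:
    """Flags a fall when the tracked centroid stays near the floor."""

    def __init__(self):
        self.low_count = 0

    def update(self, depth, background):
        height = centroid_height(depth, segment_foreground(depth, background))
        if height is not None and height < FALL_HEIGHT_M:
            self.low_count += 1
        else:
            self.low_count = 0
        return self.low_count >= FALL_FRAMES


if __name__ == "__main__":
    # Synthetic demo: empty-room background, then a "person" whose centroid
    # height drops abruptly and stays low (a simulated fall).
    rng = np.random.default_rng(0)
    background = np.full((120, 160), CAMERA_HEIGHT_M) + rng.normal(0, 0.01, (120, 160))
    detector = FallDetector()
    for t in range(60):
        depth = background.copy()
        person_height = 1.7 if t < 30 else 0.3   # standing, then lying on the floor
        depth[40:80, 60:100] = CAMERA_HEIGHT_M - person_height
        if detector.update(depth, background):
            print(f"fall detected at frame {t}")
            break
```

In a real deployment, the height computation would rely on the camera's self-calibrated pose rather than the fixed overhead geometry assumed here, and temporal tracking would be more robust than a simple consecutive-frame counter.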