Purpose
The purpose of this paper is to propose and develop a live interaction-based video player system, named LIV4Smile, for improving the social smile in individuals with autism spectrum disorder (ASD).

Design/methodology/approach
The proposed LIV4Smile intervention was a video player that operated by detecting smiles using a convolutional neural network (CNN)-based algorithm. To maintain live interaction, a CNN-based smile detector was configured and integrated into the system. A statistical test was also conducted to validate the performance of the system.

Findings
A significant improvement was observed in the smile responses of individuals with ASD when the proposed LIV4Smile system was used in a real-time environment.

Research limitations/implications
The small sample size, the need for clinical use in validation, and the initial training required for individuals with ASD to use LIV4Smile could be considered as limitations of this study.

Originality/value
The main aim of this study was to address inclusive practices for children with autism. The proposed CNN-based LIV4Smile intervention achieved high accuracy in facial smile detection.
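
The following is a minimal sketch of how a CNN-based smile detector could gate video playback in a live loop, in the spirit of the methodology described above. It assumes a webcam feed, an OpenCV Haar cascade for face localization, and a hypothetical pre-trained Keras smile classifier (smile_cnn.h5 with 64x64 grayscale input); the model file, input size, reward clip and play/pause policy are illustrative assumptions, not the authors' published implementation.

# Sketch: advance a reward video only while a smile is detected on the webcam.
# smile_cnn.h5, the 64x64 input size and reward_clip.mp4 are hypothetical.
import cv2
import numpy as np
import tensorflow as tf

FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cnn = tf.keras.models.load_model("smile_cnn.h5")  # assumed pre-trained weights

def smile_detected(frame, threshold=0.5):
    """Return True if the CNN scores the largest detected face as smiling."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = FACE_CASCADE.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return False
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest face in view
    crop = cv2.resize(gray[y:y + h, x:x + w], (64, 64)) / 255.0  # assumed input shape
    prob = float(smile_cnn.predict(crop[np.newaxis, ..., np.newaxis], verbose=0)[0][0])
    return prob >= threshold

camera = cv2.VideoCapture(0)                  # live camera watching the user
video = cv2.VideoCapture("reward_clip.mp4")   # hypothetical reward video

while True:
    ok, cam_frame = camera.read()
    if not ok:
        break
    if smile_detected(cam_frame):
        ok, vid_frame = video.read()          # playback advances only while smiling
        if ok:
            cv2.imshow("LIV4Smile", vid_frame)
    if cv2.waitKey(30) & 0xFF == ord("q"):
        break

camera.release()
video.release()
cv2.destroyAllWindows()

In this sketch the playback-gating rule (pause whenever no smile is detected in the current frame) is one simple policy choice; a deployed system would likely smooth predictions over several frames to avoid flicker.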