Virtual Reality (VR) technology has shown promise in simulating music scenes for emotion regulation. The challenge is to enhance the realism and interactivity of VR-based music experiences so that users encounter a more immersive and emotionally impactful environment. This manuscript proposes an augmented physics-informed neural network with a seahorse optimization algorithm for music scene simulation and emotion regulation in virtual reality technology (APINN-SHO-MSS-ERVRT). Initially, data is collected via a head-mounted display (HMD) in real time. The data is then passed to a pre-processing stage based on data-adaptive Gaussian average filtering (DAGAF), which improves virtual presence. The pre-processed output is fed to the augmented physics-informed neural network (APINN) combined with the seahorse optimization algorithm (SHO) to classify facial expressions into the fear and null categories. SHO is employed to tune the weight parameters of the APINN classifier so that facial emotions are classified precisely. The proposed method is implemented in MATLAB, and its performance is compared with existing methods. Performance metrics such as accuracy, precision, sensitivity, specificity, computational time and ROC are analysed. The proposed APINN-SHO-MSS-ERVRT method provides 90% higher accuracy for the fear emotion and 97% higher accuracy for the null emotion when compared with the existing methods GERR-NN-AMSVR, ER-VR-EEG and MTS-VRS-CDI. The proposed APINN-SHO-MSS-ERVRT method also attains 0.87%, 0.88% and 0.89% higher ROC than the existing methods, respectively.
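The abstract does not give implementation details of DAGAF, APINN, or SHO, so the following is only a minimal Python sketch of the described pipeline under assumed simplifications: a data-adaptive Gaussian smoother stands in for DAGAF, a single-layer classifier stands in for APINN (the physics-informed penalty term is omitted), and a generic population-based search stands in for SHO-style weight tuning. All names (adaptive_gaussian_filter, seahorse_style_optimizer, classify, fitness) and parameters are hypothetical and for illustration only.

```python
import numpy as np

def adaptive_gaussian_filter(signal, base_sigma=1.0):
    """Data-adaptive Gaussian averaging (assumed DAGAF simplification):
    the kernel width shrinks where local variance is high, preserving
    sharp changes in the signal."""
    filtered = np.empty(len(signal), dtype=float)
    global_var = np.var(signal) + 1e-8
    for i in range(len(signal)):
        lo, hi = max(0, i - 5), min(len(signal), i + 6)
        local_var = np.var(signal[lo:hi]) + 1e-8
        sigma = base_sigma * global_var / (global_var + local_var)
        offsets = np.arange(lo, hi) - i
        weights = np.exp(-0.5 * (offsets / sigma) ** 2)
        filtered[i] = np.dot(weights, signal[lo:hi]) / weights.sum()
    return filtered

def classify(weights, features):
    """Single-layer classifier: probability of 'fear' (vs. 'null')."""
    w, b = weights[:-1], weights[-1]
    return 1.0 / (1.0 + np.exp(-(features @ w + b)))

def fitness(weights, X, y):
    """Cross-entropy loss; a physics-informed penalty would be added here
    in the full APINN formulation (not specified in the abstract)."""
    p = np.clip(classify(weights, X), 1e-7, 1 - 1e-7)
    return -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

def seahorse_style_optimizer(X, y, dim, pop=20, iters=100, seed=0):
    """Generic population-based search standing in for SHO: candidates move
    toward the current best solution with decaying random perturbations."""
    rng = np.random.default_rng(seed)
    swarm = rng.normal(size=(pop, dim))
    best = min(swarm, key=lambda s: fitness(s, X, y)).copy()
    for t in range(iters):
        step = 1.0 - t / iters  # decaying exploration scale
        for i in range(pop):
            candidate = (swarm[i]
                         + step * rng.normal(size=dim)
                         + 0.5 * (best - swarm[i]) * rng.random())
            if fitness(candidate, X, y) < fitness(swarm[i], X, y):
                swarm[i] = candidate
        best = min(swarm, key=lambda s: fitness(s, X, y)).copy()
    return best

# Toy usage: synthetic "facial feature" traces for fear vs. null.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(1.0, 0.5, (50, 4)), rng.normal(-1.0, 0.5, (50, 4))])
X = np.apply_along_axis(adaptive_gaussian_filter, 0, X)  # smooth each channel
y = np.concatenate([np.ones(50), np.zeros(50)])           # 1 = fear, 0 = null
w = seahorse_style_optimizer(X, y, dim=X.shape[1] + 1)
print(f"training accuracy: {np.mean((classify(w, X) > 0.5) == y):.2f}")
```

The sketch only illustrates the data flow reported in the abstract (HMD signal → adaptive filtering → optimizer-tuned classifier); the paper's MATLAB implementation and the actual APINN and SHO formulations may differ substantially.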