As virtual reality (VR) technology advances, research has focused on enhancing VR content to provide a more realistic user experience. Traditional emotion analysis relies on surveys, but these suffer from delayed responses and reduced immersion, which distort the results. To overcome these limitations, we propose an emotion analysis method that uses sensor data collected in the VR environment. Our approach captures the user’s immediate responses without reducing immersion. Linear regression, classification analysis, and tree-based methods were applied to electrocardiogram (ECG) and galvanic skin response (GSR) sensor data to estimate valence and arousal values. We also introduce a novel emotional dimension model by analyzing the correlations between emotions and the valence and arousal values. Experimental results show accuracies of up to 77% and 92.3% for valence and arousal prediction, respectively, using GSR sensor data. Furthermore, an accuracy of 80.25% was achieved in predicting valence and arousal across nine emotions. The proposed model improves VR content through more accurate emotion analysis in VR environments and can be useful for customer targeting in industries such as marketing, gaming, education, and healthcare.
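As a minimal sketch of how a tree-based classifier might be applied to GSR-derived features for arousal prediction (the features, data shapes, and labels below are illustrative assumptions, not the study's actual pipeline or dataset):

```python
# Illustrative sketch only: tree-based classification of arousal labels
# from hypothetical GSR features, using scikit-learn.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder feature matrix: each row is one VR stimulus window with
# assumed GSR statistics (e.g., mean level, peak count, response amplitude).
X = rng.normal(size=(500, 3))
# Placeholder binary labels: 1 = high arousal, 0 = low arousal.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Fit a random forest and report held-out accuracy, analogous to the
# accuracy metric cited for valence/arousal prediction.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_train, y_train)
print("arousal accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

The same structure could be repeated with a valence label column, or with regression models when continuous valence and arousal values are required.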