Monitoring stress is relevant in many areas, including sports science. In this context, several studies have demonstrated the feasibility of stress classification using eye tracking data. In most cases, however, a screen-based experimental design restricted the motion of participants, so the transferability of the results to dynamic sports applications remains unclear. To address this research gap, we conducted a virtual reality-based stress test consisting of a football goalkeeping scenario. We propose a stress classification pipeline that relies solely on gaze behaviour and pupil diameter metrics extracted from the recorded data. To optimize the pipeline, we applied feature selection and compared the performance of different classification methods. Results show that a Random Forest classifier achieves the best performance with 87.3% accuracy, comparable to state-of-the-art approaches that fuse eye tracking data with additional biosignals. Moreover, our approach outperforms existing methods that rely exclusively on eye measures.
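The following is a minimal illustrative sketch of such a pipeline, not the authors' implementation: it assumes scikit-learn, a placeholder feature matrix of gaze-behaviour and pupil-diameter metrics per trial, and binary stress labels, combining feature selection with a Random Forest classifier evaluated by cross-validation.

```python
# Hypothetical sketch of a gaze/pupil-based stress classification pipeline.
# The feature matrix, labels, feature count, and hyperparameters are placeholders,
# not values reported in the paper.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))    # placeholder: 200 trials x 20 eye-tracking features
y = rng.integers(0, 2, size=200)  # placeholder: stress vs. no-stress labels

pipeline = Pipeline([
    ("scale", StandardScaler()),               # normalise feature ranges
    ("select", SelectKBest(f_classif, k=10)),  # keep the 10 most informative features
    ("clf", RandomForestClassifier(n_estimators=100, random_state=0)),
])

scores = cross_val_score(pipeline, X, y, cv=5, scoring="accuracy")
print(f"Mean cross-validated accuracy: {scores.mean():.3f}")
```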