We investigated how attention-demanding aural and visual discrimination tasks attenuate visually induced self-motion perception (vection), and how task accuracy and response time are affected by experiencing various levels of vection-inducing motion in a virtual environment. Seventeen seated observers were presented with simulated self-motion through a straight virtual corridor at virtual camera speeds ranging from stationary to 15 m/s, viewed through a Vive Pro virtual reality headset, while they performed an aural discrimination task, a visual discrimination task, or no task at all. Observers generally perceived less vection at all motion levels when performing the visual discrimination task than when they had no task. Increased vection was associated with reduced accuracy on the visual task and increased response time on the aural task. These results suggest that the vection perceived in virtual reality simulators can be attenuated when users perform attention-demanding tasks involving visual processing and, conversely, that vection-producing motion can affect performance on attention-demanding tasks.