Spatial attention is an important mechanism for filtering everyday sensory input. The direction of attention can be guided by visual, auditory, or tactile stimuli. The literature on cueing spatial attention in visual search tasks consistently shows improvements in accuracy and reaction time. However, most studies to date have used two-dimensional setups, whose ecological validity may be questioned. In this study with healthy participants, we investigated the feasibility of a virtual reality-based setup and compared performance in a visual search task under auditory, tactile, and combined audio-tactile cues. The results revealed high usability and a significantly higher detection rate for combined audio-tactile cues than for auditory cues alone.