Although research on brain–computer interfaces (BCIs) has been active in recent years, acquiring high-quality brain electrical signals that allow human intentions to be recognized accurately enough for reliable communication and interaction remains a challenging task. Evidence has shown that visually guided motor imagery (MI) can modulate sensorimotor electroencephalographic (EEG) rhythms in humans, but how to design and implement effective visual guidance during MI so as to produce stronger event-related desynchronization (ERD) patterns remains unclear. The aim of this paper is to investigate how using object-directed movements in a virtual environment as visual guidance affects the modulation of sensorimotor EEG rhythms generated by hand MI. To improve MI classification accuracy, we further propose an algorithm that automatically extracts subject-specific optimal frequency and time bands for discriminating the ERD patterns produced by left- and right-hand MI. The experimental results show that the average classification accuracy in the object-directed scenario is substantially higher than in the non-object-directed scenario (76.87% vs. 69.66%), and a t-test confirms that the difference is statistically significant (p = 0.0207). Compared with algorithms based on fixed frequency and time bands, using the subject-specific optimal frequency and time bands obtained by the proposed algorithm enhances the contralateral-dominant ERD patterns. These findings have the potential to improve the effectiveness and robustness of MI-based BCI applications.
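The abstract does not describe how the subject-specific frequency and time bands are extracted; the sketch below is only an illustration of one plausible approach, not the paper's method. It assumes a cross-validated grid search over candidate mu/beta sub-bands and post-cue time windows, scoring each combination by linear discriminant analysis (LDA) accuracy on log band-power features; all function names, band limits, window choices, and the synthetic data are hypothetical.

import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def bandpass(epochs, lo, hi, fs, order=4):
    # Zero-phase Butterworth band-pass along the sample axis of (trials, channels, samples)
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, epochs, axis=-1)

def log_var_features(epochs):
    # Log band power (channel-wise variance), a simple ERD-sensitive feature
    return np.log(np.var(epochs, axis=-1) + 1e-12)

def select_band_and_window(epochs, labels, fs, bands, windows, cv=5):
    # Grid-search candidate frequency bands and time windows, keeping the
    # combination with the highest cross-validated LDA accuracy
    best = (None, None, -np.inf)
    for lo, hi in bands:
        filtered = bandpass(epochs, lo, hi, fs)
        for t0, t1 in windows:
            s0, s1 = int(t0 * fs), int(t1 * fs)
            X = log_var_features(filtered[:, :, s0:s1])
            acc = cross_val_score(LinearDiscriminantAnalysis(), X, labels, cv=cv).mean()
            if acc > best[2]:
                best = ((lo, hi), (t0, t1), acc)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fs = 250                                         # assumed sampling rate (Hz)
    epochs = rng.standard_normal((40, 3, 4 * fs))    # 40 trials, 3 channels (e.g. C3, Cz, C4), 4 s
    labels = np.repeat([0, 1], 20)                   # left- vs. right-hand MI
    bands = [(8, 12), (10, 14), (12, 16), (16, 24)]  # candidate mu/beta sub-bands (Hz)
    windows = [(0.5, 2.5), (1.0, 3.0), (1.5, 3.5)]   # candidate post-cue windows (s)
    band, window, acc = select_band_and_window(epochs, labels, fs, bands, windows)
    print(f"best band {band} Hz, window {window} s, CV accuracy {acc:.2f}")

In practice such a search would run on each subject's training trials separately, so that the selected band and window capture that subject's individual ERD characteristics rather than a fixed group-level setting.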