The performance of kernel-based dimensionality reduction relies heavily on the choice of kernel function. Multiple kernel learning for dimensionality reduction (MKL-DR) has recently been proposed to learn a convex combination of a set of base kernels. However, this method relaxes a nonconvex quadratically constrained quadratic programming (QCQP) problem into a semi-definite programming (SDP) problem to determine the kernel weights, which may degrade its performance. Although a trace ratio maximization approach to multiple-kernel-based dimensionality reduction (MKL-TR) has been presented to avoid this convex relaxation, it must solve a generalized eigenvalue problem in each iteration of its algorithm, which is expensive in both time and memory. To further improve on these methods, this paper proposes a novel multiple kernel dimensionality reduction method based on spectral regression and trace ratio maximization, termed MKL-SRTR. The proposed approach learns an appropriate kernel from the multiple base kernels and a transformation into a lower-dimensional space both efficiently and effectively. Experimental results on benchmark datasets demonstrate the effectiveness of the proposed method in supervised, unsupervised, and semi-supervised scenarios.
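For concreteness, the two ingredients named in the abstract can be sketched in generic notation; the symbols used here ($K_m$ for the base kernels, $\beta_m$ for their weights, $W$ for the projection, and $S_p$, $S_q$ for the two scatter-like matrices of the embedding criterion) are illustrative and not necessarily the paper's own. The multiple-kernel part learns a convex combination of $M$ base kernels,
\[
K_{\boldsymbol{\beta}} = \sum_{m=1}^{M} \beta_m K_m,
\qquad \beta_m \ge 0, \quad \sum_{m=1}^{M} \beta_m = 1,
\]
while the trace ratio part seeks a projection $W$ that maximizes a criterion of the general form
\[
\max_{W} \; \frac{\operatorname{tr}\!\left(W^{\top} S_p W\right)}{\operatorname{tr}\!\left(W^{\top} S_q W\right)},
\]
rather than the ratio-trace surrogate $\operatorname{tr}\!\big((W^{\top} S_q W)^{-1} W^{\top} S_p W\big)$ that a generalized eigenvalue formulation would optimize. How MKL-SRTR alternates between updating $\boldsymbol{\beta}$ and $W$, and how spectral regression enters the computation, is detailed in the body of the paper.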