In high-dimensional statistical learning, variable selection and the handling of highly correlated variables are two central challenges. Elastic-net regularization performs variable selection automatically and tends to select or discard highly correlated variables together; consequently, it has been widely adopted in machine learning. In this paper, we incorporate elastic-net regularization into the support vector regression model, yielding the Elastic-net Support Vector Regression (En-SVR) model. Owing to the elastic-net penalty, En-SVR is capable of variable selection and is well suited to high-dimensional, highly correlated statistical learning problems. The resulting optimization problem, however, is rather involved, and standard methods are difficult to apply to it directly. We observe that it can be reformulated as a convex optimization problem whose objective is separable into multiple blocks linked by an inequality constraint. We therefore employ a novel and efficient Alternating Direction Method of Multipliers (ADMM) algorithm to solve the En-SVR model, and we provide both a complexity analysis and a convergence analysis for the algorithm. Furthermore, extensive numerical experiments validate the strong performance of En-SVR in high-dimensional statistical learning and the efficiency of the proposed ADMM algorithm.
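To illustrate the two ingredients named above, the ADMM splitting style and the grouping behavior of the elastic-net penalty, the following is a minimal sketch. It is not the paper's En-SVR algorithm: for tractability it uses a squared loss in place of the ε-insensitive SVR loss, and the simple two-block splitting w = z rather than the block-separable reformulation with an inequality constraint described in the abstract. All parameter values are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # elementwise soft-thresholding: the proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def elastic_net_admm(X, y, lam1=0.5, lam2=1.0, rho=1.0, n_iter=200):
    """ADMM sketch for min_w 0.5*||Xw - y||^2 + lam1*||w||_1 + 0.5*lam2*||w||^2
    via the consensus splitting w = z. NOTE: squared loss stands in for the
    epsilon-insensitive SVR loss of the En-SVR model; illustrative only."""
    n, p = X.shape
    z = np.zeros(p)
    u = np.zeros(p)  # scaled dual variable
    A = X.T @ X + rho * np.eye(p)  # cached system matrix for every w-update
    Xty = X.T @ y
    for _ in range(n_iter):
        # w-update: smooth quadratic subproblem, solved exactly
        w = np.linalg.solve(A, Xty + rho * (z - u))
        # z-update: proximal operator of the elastic-net penalty
        z = soft_threshold(w + u, lam1 / rho) / (1.0 + lam2 / rho)
        # dual ascent on the consensus constraint w = z
        u = u + w - z
    return z

rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
# make columns 0 and 1 almost identical to mimic highly correlated variables
X[:, 1] = X[:, 0] + 0.01 * rng.standard_normal(n)
w_true = np.zeros(p)
w_true[:2] = 1.0
y = X @ w_true + 0.1 * rng.standard_normal(n)

w_hat = elastic_net_admm(X, y)
print(np.round(w_hat, 2))
```

Running the sketch, the two nearly duplicated columns receive similar nonzero weights while the irrelevant coefficients are shrunk to (or near) zero, which is the "select or discard correlated variables together" behavior the elastic-net penalty is used for.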