This paper introduces a novel bounded loss framework for support vector machine (SVM) and support vector regression (SVR). Specifically, taking the pinball loss as a starting point, we devise a novel bounded exponential quantile loss (Leq-loss) for both classification and regression tasks. The Leq-loss not only enhances the robustness of SVM and SVR against outliers but also, from a different perspective, improves the robustness of SVM to resampling. Based on the Leq-loss, we construct EQSVM and EQSVR and derive the influence functions and breakdown point lower bounds of their estimators. We prove that the influence functions are bounded and that the breakdown point lower bounds can attain the highest asymptotic breakdown point of 1/2. We further demonstrate the robustness of EQSVM to resampling and derive its generalization error bound based on Rademacher complexity. Because the Leq-loss is non-convex, we employ the concave–convex procedure (CCCP) to transform the training problem into a sequence of convex subproblems, each of which is solved with the ClipDCD algorithm. Extensive experiments confirm the effectiveness of the proposed EQSVM and EQSVR.
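The CCCP step mentioned above can be illustrated on a minimal one-dimensional example. Everything in the sketch below is an illustrative assumption, not the paper's formulation: the toy objective, its difference-of-convex split, and the helper `cccp` are chosen only to show how CCCP linearizes the concave part and solves a convex subproblem at each iteration.

```python
# Toy sketch of the concave-convex procedure (CCCP). This is NOT the paper's
# EQSVM/EQSVR training problem; it minimizes the assumed DC objective
#   f(x) = x**4 - 2*x**2 = u(x) + v(x),
# where u(x) = x**4 is convex and v(x) = -2*x**2 is concave.

def cccp(x0, iters=50):
    """Minimize x**4 - 2*x**2 by repeatedly linearizing the concave part.

    At iterate x_k, CCCP replaces v(x) by its tangent v(x_k) + v'(x_k)*(x - x_k),
    with v'(x_k) = -4*x_k, leaving the convex subproblem
        min_x  x**4 - 4*x_k*x.
    Setting the derivative 4*x**3 - 4*x_k to zero gives the closed-form update
    x_{k+1} = cbrt(x_k).
    """
    x = x0
    for _ in range(iters):
        # Closed-form solution of the convex subproblem (sign-safe cube root).
        x = (1.0 if x >= 0 else -1.0) * abs(x) ** (1.0 / 3.0)
    return x

# The global minimizers of x**4 - 2*x**2 are x = +1 and x = -1; starting from
# 0.5 the CCCP iterates increase monotonically toward +1.
x_star = cccp(0.5)
```

In the paper's setting the convex subproblem has no closed form and is instead handled by an iterative solver (ClipDCD); the toy update above stands in for that inner solve.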