Mixed integer programming (MIP) is commonly used to model indicator constraints, i.e., constraints that either hold or are relaxed depending on the value of a binary variable. Unfortunately, those models tend to lead to weak continuous relaxations and often turn out to be unsolvable in practice; this is what happens, e.g., in the case of Classification problems with Ramp Loss functions, which represent an important application in this context. In this paper we present computational evidence that a relevant class of these Classification instances can be solved far more efficiently if a nonlinear, nonconvex reformulation of the indicator constraints is used instead of the linear one. Inspired by this surprising empirical observation, we show that aggressive bound tightening is the crucial ingredient for solving this class of instances, and we devise a pair of computationally effective algorithmic approaches that exploit it within MIP. One of these methods has been part of the arsenal of IBM-Cplex since version 12.6.1. More generally, we argue that aggressive bound tightening is often overlooked in MIP, even though it represents a significant building block for enhancing MIP technology when indicator constraints and disjunctive terms are present.
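As a minimal illustration of the two modeling options contrasted above (a generic sketch, not necessarily the exact reformulation studied in the paper), consider an indicator constraint stating that a linear inequality $a^\top x \le b$ must hold whenever a binary variable $z$ equals one. The standard linear model uses a big-$M$ term, whereas a nonlinear, nonconvex alternative enforces the same logic through a product with the binary variable:
\[
\underbrace{a^\top x \le b + M\,(1-z)}_{\text{linear big-}M\text{ model}}
\qquad\text{versus}\qquad
\underbrace{z\,\bigl(a^\top x - b\bigr) \le 0}_{\text{nonlinear, nonconvex model}},
\qquad z \in \{0,1\}.
\]
In both cases, setting $z=1$ activates $a^\top x \le b$, while $z=0$ relaxes it; the big-$M$ version requires a valid (and often loose) bound $M$, which is one source of the weak continuous relaxations mentioned above.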