Non-linear classification machines are seldom trained under criteria that are usual and useful for linear discriminants, such as minimax, Fisher's, and similar criteria, because of the learning difficulties that transformation-trainable machines suffer when such criteria are applied. However, the possibility of using non-linear machines whose transformations are pre-designed merits attention. In this contribution, we propose and study an efficient and potentially effective option: applying Disjoint Tangent Configurations (DTC), a formulation that includes discriminants such as Fisher's, Bayes for normal distributions, the Minimax Probabilistic Decision Hyperplane (MPDH), and others, to the output of a Radial Basis Function (RBF) network. The network is designed beforehand with a moderate number of nodes to reduce the computational load, but with a high-quality centroid selection algorithm, Frequency Sensitive Competitive Learning (FSCL), which yields networks with high representation capabilities. Experiments demonstrate that this approach achieves good performance with acceptable computational effort.
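
The following is a minimal sketch, not the authors' implementation, of the kind of pipeline the abstract describes: FSCL centroid selection, a pre-designed Gaussian RBF transformation, and a linear discriminant (here Fisher's, one member of the DTC family named above; the DTC formulation itself is not reproduced) applied to the RBF outputs. Function names and parameters such as n_centroids, lr, and sigma are illustrative assumptions.

    import numpy as np

    def fscl_centroids(X, n_centroids, n_epochs=20, lr=0.05, seed=0):
        """Frequency Sensitive Competitive Learning (sketch): each unit's distance
        is scaled by its win count, so all centroids end up being used."""
        rng = np.random.default_rng(seed)
        C = X[rng.choice(len(X), n_centroids, replace=False)].copy()
        wins = np.ones(n_centroids)
        for _ in range(n_epochs):
            for x in X[rng.permutation(len(X))]:
                d = wins * np.linalg.norm(C - x, axis=1)  # frequency-penalised distance
                j = np.argmin(d)
                C[j] += lr * (x - C[j])                   # move the winning centroid towards x
                wins[j] += 1
        return C

    def rbf_features(X, C, sigma=1.0):
        """Gaussian RBF transformation: one feature per centroid."""
        d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def fisher_discriminant(Phi, y):
        """Fisher's linear discriminant on the RBF feature space (two classes)."""
        m0, m1 = Phi[y == 0].mean(0), Phi[y == 1].mean(0)
        Sw = np.cov(Phi[y == 0], rowvar=False) + np.cov(Phi[y == 1], rowvar=False)
        w = np.linalg.solve(Sw + 1e-6 * np.eye(Sw.shape[0]), m1 - m0)
        b = -0.5 * w @ (m0 + m1)                          # threshold at the projected midpoint
        return w, b

    # Illustrative usage with arbitrary two-class data X (n x d) and labels y in {0, 1}:
    #   C = fscl_centroids(X, n_centroids=10)
    #   Phi = rbf_features(X, C, sigma=1.0)
    #   w, b = fisher_discriminant(Phi, y)
    #   y_pred = (Phi @ w + b > 0).astype(int)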