Traditional large-scale support vector machine (SVM) algorithms based on the hinge loss tend to perform poorly in the presence of noise, especially when the model is trained incrementally. In this paper, we propose a stochastic quasi-Newton method-based twin parametric SVM with the pinball loss (termed SQN-PTWSVM), which is more efficient and more robust to noise than the conventional hinge loss SVM in large-scale data scenarios. To establish the theoretical convergence of the method, a modified version of SQN-PTWSVM, termed SQN-SPTWSVM, is also proposed. It overcomes the poor convergence of the stochastic gradient twin SVM, resulting in a faster and more reliable model. In SQN-SPTWSVM, the obtained hyperplanes are stable enough to handle the noise and resampling issues that occur frequently in stochastic learning, leading to better generalization ability of the classifier. The proposed method is also extended to nonlinear scenarios. Moreover, batch versions of the proposed algorithms are introduced, which significantly reduce the training time and memory requirements of SQN-PTWSVM and SQN-SPTWSVM. Experimental results on several benchmark datasets and activity recognition applications show that the proposed methods outperform existing classifiers in both speed and accuracy.
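For reference, the pinball loss referred to above is the standard quantile-style loss from the pin-SVM literature, L_tau(u) = max(u, -tau * u) with u typically taken as the margin residual 1 - y f(x). A minimal sketch follows; the function and parameter names are illustrative and not drawn from the paper:

```python
import numpy as np

def pinball_loss(u, tau=0.5):
    """Pinball loss L_tau(u) = max(u, -tau * u).

    For tau = 0 this reduces to the hinge-style loss max(u, 0);
    tau > 0 also penalizes correctly classified points near the
    boundary, which is the source of the loss's noise robustness.
    """
    u = np.asarray(u, dtype=float)
    return np.maximum(u, -tau * u)

# Example: margin residuals u = 1 - y * f(x) for three samples.
residuals = np.array([-0.5, 0.0, 0.8])
print(pinball_loss(residuals, tau=0.5))  # -> [0.25 0.   0.8 ]
```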