Kernel ridge regression (KRR) is a well-established and efficient machine learning method that has been successfully applied to both classification and regression problems. KRR solves a system of linear equations rather than a quadratic programming problem. However, KRR assigns equal importance to every training sample, so informative and uninformative samples receive the same weight, which can degrade classification accuracy. To resolve this issue, this paper proposes a novel kernel ridge regression based on intuitionistic fuzzy membership (IFKRR) for binary classification. In IFKRR, each training sample is assigned an intuitionistic fuzzy number composed of a membership degree and a non-membership degree. A pattern’s membership degree is based on its distance from the corresponding class center, while its non-membership degree is given by the ratio of the number of heterogeneous points to the total number of points in its neighborhood. The proposed IFKRR model can efficiently reduce the influence of noise in datasets. To evaluate the efficiency of IFKRR, its performance is compared with support vector machine (SVM), twin SVM (TWSVM), intuitionistic fuzzy SVM (IFSVM), intuitionistic fuzzy TWSVM (IFTSVM), random vector functional link with univariate trees (RFL), KRR and Co-trained KRR average (CoKRR-avg) on an artificial dataset and several real-world datasets using the Gaussian kernel. Computational results reveal the efficacy of the IFKRR model on real-world as well as noisy datasets.
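The two ingredients described above — an intuitionistic fuzzy score per training sample (membership from the distance to the class center, non-membership from the fraction of heterogeneous neighbors) and a sample-weighted KRR solved as a linear system — can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the combination rule for the two degrees, the `k` neighborhood size, and the way the score enters the regularizer (here as a per-sample weight `S` in `(K + S^{-1}/C)α = y`) are assumptions for the sketch.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel between the rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-d2 / (2 * sigma**2))

def intuitionistic_scores(X, y, k=5):
    # Membership: decreases with distance from the sample's class center.
    n = len(y)
    mu = np.zeros(n)
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        d = np.linalg.norm(X[idx] - X[idx].mean(0), axis=1)
        mu[idx] = 1.0 - d / (d.max() + 1e-8)
    # Non-membership: fraction of heterogeneous points among k nearest neighbors.
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(D, np.inf)
    nn = np.argsort(D, axis=1)[:, :k]
    nu = np.array([(y[nn[i]] != y[i]).mean() for i in range(n)])
    # One common way to fuse the two degrees into a single score (assumed here):
    # clean points keep mu; points dominated by heterogeneous neighbors are down-weighted.
    return np.where(nu == 0, mu, np.where(mu <= nu, 0.0, (1.0 - nu) * mu))

def weighted_krr_fit(X, y, s, sigma=1.0, C=1.0):
    # Sample-weighted KRR: solve the linear system (K + S^{-1}/C) alpha = y,
    # where S = diag(s) holds the fuzzy scores (no quadratic program needed).
    K = gaussian_kernel(X, X, sigma)
    S_inv = np.diag(1.0 / (C * np.maximum(s, 1e-6)))
    return np.linalg.solve(K + S_inv, y.astype(float))

def krr_predict(X_train, X_test, alpha, sigma=1.0):
    # Binary decision: sign of the kernel expansion.
    return np.sign(gaussian_kernel(X_test, X_train, sigma) @ alpha)
```

A noisy or outlying sample receives a small score `s`, so its diagonal entry `1/(C s)` is large and the model is pushed toward ignoring it, which is the mechanism by which the weighting reduces the influence of noise.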