The Synthetic Minority Oversampling Technique (SMOTE) is a popular preprocessing method for handling the class imbalance problem, one of the main challenges in the field of data mining. In this paper, we argue that the main issue in SMOTE and its variants is the tradeoff between overfitting and noise prevention. To overcome this issue, we propose a novel method named LSMOTE, which generates synthetic minority examples through a link-based approach combined with noise filtering for binary imbalanced datasets. First, a data smoothing method is used to identify outlier points in the majority class. Then, linked regions are recognized, within which a Gaussian process is employed to generate synthetic examples. Finally, because this interpolation strategy is adventurous, a simple noise filtering step is applied to remove potentially noisy synthetic examples. Experimental results on 55 imbalanced datasets selected from the KEEL dataset repository demonstrate the good behavior of the proposed LSMOTE, which achieves the best classification performance compared with classic SMOTE and several state-of-the-art SMOTE variants.
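For context on the baseline the abstract refers to, the following is a minimal sketch of classic SMOTE interpolation (not of the proposed LSMOTE): each synthetic minority example is placed at a random point on the segment between a minority example and one of its k nearest minority-class neighbors. The function name `smote_sketch` and the toy data are illustrative assumptions, not part of the paper.

```python
import numpy as np

def smote_sketch(X_min, n_new, k=5, rng=None):
    """Illustrative sketch of classic SMOTE: interpolate between a
    minority example and one of its k nearest minority neighbors."""
    rng = np.random.default_rng(rng)
    X_min = np.asarray(X_min, dtype=float)
    # pairwise Euclidean distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                 # a point is not its own neighbor
    nn = np.argsort(d, axis=1)[:, :k]           # k nearest minority neighbors
    base = rng.integers(0, len(X_min), n_new)   # random base example per synthetic
    nbr = nn[base, rng.integers(0, k, n_new)]   # random neighbor of that base
    gap = rng.random((n_new, 1))                # interpolation coefficient in [0, 1)
    return X_min[base] + gap * (X_min[nbr] - X_min[base])

# usage: oversample a toy minority class of 6 points with 10 synthetic examples
X_min = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [2, 1], [1, 2]], float)
X_syn = smote_sketch(X_min, n_new=10, k=3, rng=0)
```

Because every synthetic point lies on a segment between two existing minority examples, the generated points stay inside the minority region; the abstract's point is that this conservative interpolation can overfit dense areas, while more adventurous strategies (such as LSMOTE's Gaussian-process generation) require a subsequent noise filter.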