This paper introduces BBATDD (Boosting-Based Algorithm Trained with Drift Detector), a novel machine learning approach that departs from traditional epoch-based training by generating a data stream from the original dataset. The proposed method is versatile and applicable to a broad range of deep architectures as well as other machine learning models. The key to its effectiveness is the conversion of the original data into a streaming sequence that is then fed to the neural network. To monitor the learning process and prevent overfitting, a drift detector, a tool commonly used in stream data mining, is employed. In addition, two methods, ELB and ENT, are introduced to prioritize problematic data elements during learning, further improving model performance. Extensive simulations on benchmark datasets (MNIST and CIFAR) confirm that BBATDD achieves higher accuracy with significantly fewer batches, without sacrificing generalization capability. For instance, on MNIST, after processing 10 000 batches, our approach improves the loss function value by 45.6%.
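
To make the streaming idea concrete, the following minimal sketch illustrates the overall loop described above; it is not the paper's implementation. A fixed dataset is resampled into an endless stream of batches, per-example sampling weights act as a simplified stand-in for the ELB/ENT prioritization, and a Page-Hinkley change detector (one possible stream-mining drift detector) watches the validation loss to decide when training should stop. The model (an SGD classifier), the detector parameters, the loss-based re-weighting rule, and the use of scikit-learn's small digits dataset in place of MNIST are all illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.metrics import log_loss
from sklearn.model_selection import train_test_split
from sklearn.datasets import load_digits


class PageHinkley:
    """Simple Page-Hinkley change detector, an illustrative stand-in for the
    stream-mining drift detector that monitors the loss signal."""

    def __init__(self, delta=0.005, threshold=1.0):
        self.delta, self.threshold = delta, threshold
        self.mean, self.cum_sum, self.cum_min, self.n = 0.0, 0.0, 0.0, 0

    def update(self, value):
        self.n += 1
        self.mean += (value - self.mean) / self.n
        self.cum_sum += value - self.mean - self.delta
        self.cum_min = min(self.cum_min, self.cum_sum)
        return (self.cum_sum - self.cum_min) > self.threshold  # True => drift


def batch_stream(X, y, batch_size, weights, rng):
    """Turn a fixed dataset into an endless stream of batches; `weights`
    favours harder examples (a simplified proxy for ELB/ENT prioritization)."""
    while True:
        p = weights / weights.sum()
        idx = rng.choice(len(X), size=batch_size, p=p)
        yield idx, X[idx], y[idx]


rng = np.random.default_rng(0)
X, y = load_digits(return_X_y=True)          # small stand-in for MNIST
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)
classes = np.unique(y)

model = SGDClassifier(loss="log_loss", random_state=0)
detector = PageHinkley()
weights = np.ones(len(X_train))              # uniform sampling to start
stream = batch_stream(X_train, y_train, batch_size=32, weights=weights, rng=rng)

for step, (idx, xb, yb) in enumerate(stream):
    model.partial_fit(xb, yb, classes=classes)

    # Re-weight the just-seen examples by their loss so that "problematic"
    # elements are sampled more often later (illustrative, not ELB/ENT proper).
    proba = model.predict_proba(xb)
    per_example_loss = -np.log(
        proba[np.arange(len(yb)), np.searchsorted(classes, yb)] + 1e-12
    )
    weights[idx] = per_example_loss + 1e-3

    # Feed the validation loss into the drift detector; a detected upward
    # drift is read as the onset of overfitting and stops training.
    val_loss = log_loss(y_val, model.predict_proba(X_val), labels=classes)
    if detector.update(val_loss) or step >= 2000:
        print(f"stopping at batch {step}, validation loss {val_loss:.4f}")
        break
```

In this sketch the stopping criterion replaces a fixed epoch count: training continues over the generated stream until the drift detector flags a change in the monitored loss, so the number of processed batches adapts to the data rather than being set in advance.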