Abstract

Data streams are being generated faster, in greater volume, and in more settings than ever before. In this context, Hoeffding Trees are an established method for classification, with several extensions available, including high-performing ensemble setups such as online and leveraging bagging. k-nearest neighbours is also a popular choice, with most extensions addressing its inherent performance limitations over a potentially infinite stream. At the same time, gradient descent methods are becoming increasingly popular, owing to the proliferation of interest and successes in deep learning. Although deep neural networks can learn incrementally, they have so far proved too sensitive to hyperparameter choices and initial conditions to be considered an effective 'off-the-shelf' solution for data streams. In this work, we investigate combinations of Hoeffding trees, nearest neighbour, and gradient descent methods with a streaming preprocessing approach, in the form of a random feature functions filter, for additional predictive power. Our empirical evaluation yields positive results for the novel approaches we experiment with, highlights important issues, and sheds light on promising future directions for data stream classification.
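
To give a concrete sense of the idea named in the abstract, the sketch below is a minimal, assumption-laden stand-in (not the paper's code or experimental setup): each incoming instance is passed through a fixed random feature expansion and then used to train an incremental linear learner in a prequential (test-then-train) loop. NumPy, scikit-learn's SGDClassifier, the tanh activation, the feature count, and the synthetic stream are all illustrative choices.

```python
# Sketch only: a random feature functions filter as streaming preprocessing,
# feeding a gradient-descent learner trained prequentially. All specifics
# (tanh, 200 features, SGDClassifier, synthetic concept) are assumptions.
import numpy as np
from sklearn.linear_model import SGDClassifier


class RandomFeatureFilter:
    """Fixed random projection followed by a nonlinearity.

    Maps each instance x to h = tanh(W x + b), where W and b are drawn once
    at initialisation and never updated, so the filter adds representational
    power at constant per-instance cost.
    """

    def __init__(self, n_inputs, n_features=200, scale=1.0, seed=1):
        rng = np.random.default_rng(seed)
        self.W = rng.normal(0.0, scale, size=(n_features, n_inputs))
        self.b = rng.uniform(-1.0, 1.0, size=n_features)

    def transform(self, x):
        return np.tanh(self.W @ x + self.b)


# Toy prequential (test-then-train) loop on a synthetic binary stream.
rng = np.random.default_rng(0)
n_inputs, classes = 10, np.array([0, 1])
rff = RandomFeatureFilter(n_inputs)
learner = SGDClassifier()  # linear model updated by gradient descent

correct, tested = 0, 0
for t in range(5000):
    x = rng.normal(size=n_inputs)
    y = int(np.sum(x[:3]) > 0)           # simple synthetic concept
    z = rff.transform(x).reshape(1, -1)  # filtered representation

    if t == 0:
        # First call must declare the label space.
        learner.partial_fit(z, [y], classes=classes)
    else:
        correct += int(learner.predict(z)[0] == y)  # test first ...
        tested += 1
        learner.partial_fit(z, [y])                  # ... then train

print(f"prequential accuracy: {correct / tested:.3f}")
```

The same filter could equally sit in front of a Hoeffding tree or a kNN learner; only the base learner behind `transform` changes.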
