Abstract

This paper proposes a novel algorithm for incremental learning over streaming data in a non-stationary environment. The starting point is Probabilistic Neural Networks (PNNs), widely used as a fast and robust method for solving classification problems in the stationary setting. PNNs have solid mathematical foundations, but they fail in the non-stationary setting and cannot cope with concept drift. In this paper, a method based on orthogonal series expansions of the unknown probability densities is proposed, which significantly extends the applicability of PNNs. Nonparametric procedures, called Incremental Probabilistic Neural Networks (IPNNs), are presented for tracking drifting probability densities or, in the classification case, class-conditional densities. We prove their convergence, both in probability and with probability one, as the size of the data stream tends to infinity. In contrast to the existing literature, which relies almost exclusively on heuristics, the proposed method is mathematically justified and rests on solid theoretical foundations. Simulations on synthetic and air pollution data confirm the effectiveness of the algorithm.
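To make the general idea concrete, the sketch below illustrates an incremental orthogonal-series density estimator combined with a PNN-style Bayes decision rule. It is only a minimal illustration of the technique named in the abstract, not the authors' algorithm: the Hermite-function basis, the truncation level J, the exponential forgetting factor lam, and all class and function names are assumptions introduced here for exposition.

```python
# Illustrative sketch (assumed details): track a drifting 1-D density via
# f_hat(x) = sum_j a_j psi_j(x), where psi_j are orthonormal Hermite functions
# and a_j = E[psi_j(X)] is updated online with a forgetting factor.
import numpy as np

def hermite_functions(x, J):
    """Orthonormal Hermite functions psi_0..psi_J evaluated at x (shape (J+1, n))."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    psi = np.empty((J + 1, x.size))
    psi[0] = np.pi ** (-0.25) * np.exp(-0.5 * x ** 2)
    if J >= 1:
        psi[1] = np.sqrt(2.0) * x * psi[0]
    for k in range(2, J + 1):
        psi[k] = np.sqrt(2.0 / k) * x * psi[k - 1] - np.sqrt((k - 1) / k) * psi[k - 2]
    return psi

class IncrementalSeriesDensity:
    """Orthogonal-series density estimate whose coefficients follow a drifting stream."""
    def __init__(self, J=10, lam=0.05):
        self.J, self.lam = J, lam
        self.coef = np.zeros(J + 1)

    def update(self, x):
        # Exponentially weighted estimate of E[psi_j(X)]; the step size lam
        # trades tracking speed against estimation variance (an assumed choice).
        self.coef = (1.0 - self.lam) * self.coef + self.lam * hermite_functions(x, self.J)[:, 0]

    def density(self, x):
        # The truncated series can dip below zero; clipping is a simple remedy.
        return np.clip(self.coef @ hermite_functions(x, self.J), 0.0, None)

class StreamingPNNClassifier:
    """PNN-style rule: predict the class maximizing prior * class-conditional density."""
    def __init__(self, n_classes, J=10, lam=0.05):
        self.densities = [IncrementalSeriesDensity(J, lam) for _ in range(n_classes)]
        self.priors = np.ones(n_classes) / n_classes

    def partial_fit(self, x, y, lam_prior=0.05):
        self.densities[y].update(x)
        onehot = np.eye(len(self.priors))[y]
        self.priors = (1.0 - lam_prior) * self.priors + lam_prior * onehot

    def predict(self, x):
        scores = [p * d.density(x)[0] for p, d in zip(self.priors, self.densities)]
        return int(np.argmax(scores))

# Toy usage on a synthetic two-class stream whose class means drift over time.
rng = np.random.default_rng(0)
clf = StreamingPNNClassifier(n_classes=2)
correct = 0
for t in range(5000):
    drift = 2.0 * t / 5000                      # gradual concept drift
    y = int(rng.integers(2))
    x = rng.normal(loc=(-1.0 + drift) if y == 0 else (1.0 + drift), scale=0.7)
    if t > 100:
        correct += int(clf.predict(x) == y)
    clf.partial_fit(x, y)
print("streaming accuracy after warm-up:", correct / (5000 - 101))
```

The constant forgetting factor used here is one simple way to keep the coefficient estimates responsive to drift; the paper's convergence results concern suitably chosen learning-rate sequences rather than this fixed-step variant.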
