Abstract

With the exponential expansion of the interconnected world, a large volume, variety and velocity of data flows through our systems. Dependence on these systems has grown beyond mere business value, and such communication systems are now increasingly classified as essential. They have become vital social infrastructure that requires prediction, monitoring, safeguards and immediate decision-making in the face of threats. The key enabler is data stream analytics (DSA). The key areas of stream processing in DSA comprise prediction and forecasting, classification, clustering, mining frequent patterns and finding frequent itemsets (FISs), detecting concept drift, building synopsis structures to answer standing and ad hoc queries, sampling and load shedding in the case of bursts of data, and processing data streams emanating from the very large number of interconnected devices typical of the Internet of Things (IoT). The processing complexity is affected by the multidimensionality of the stream data objects, the need to build 'forgetting' in as a key construct of the processing, the opportunity to leverage the time-series aspect of the data to aid processing, and so on. In this chapter, we explore some of the aforementioned areas and provide a survey of each of these selected areas. We also survey the data stream processing systems (DSPSs) and frameworks being adopted by the industry at large.
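To make one of the listed areas concrete, the sketch below illustrates sampling under bursty load using reservoir sampling (Vitter's Algorithm R), a standard way to keep a fixed-size uniform sample of an unbounded stream. This is an illustrative example only, not the chapter's own method; the function name reservoir_sample and the simulated sensor stream are assumptions made for the sketch.

import random

def reservoir_sample(stream, k):
    """Maintain a uniform random sample of size k over an unbounded stream
    (Algorithm R). After n items, each item has probability k/n of being
    in the reservoir, so memory stays bounded even during bursts."""
    reservoir = []
    for n, item in enumerate(stream, start=1):
        if n <= k:
            reservoir.append(item)      # fill the reservoir with the first k items
        else:
            j = random.randrange(n)     # uniform index in [0, n)
            if j < k:                   # replace with probability k/n
                reservoir[j] = item
    return reservoir

if __name__ == "__main__":
    # Hypothetical burst of 10,000 sensor readings; keep only a 5-item sample.
    burst = (("sensor-%d" % (i % 100), i) for i in range(10_000))
    print(reservoir_sample(burst, k=5))

A load-shedding policy in a DSPS can use the same idea: when arrival rates exceed processing capacity, downstream operators work on such a bounded, statistically representative sample instead of the full stream.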
