Abstract

Web data have grown exponentially to reach zettabyte scales. Mountains of data come from online applications such as e-commerce, social media, web- and sensor-based devices, business websites, and other content posted by users. Big data analytics (BDA) can help derive new insights from this huge and fast-growing data source. The core advantage of BDA technology lies in its ability to mine these data and reveal underlying trends. BDA, however, faces an inherent difficulty in optimizing the processes and capabilities required to merge diverse data assets into viable information. This paper explores the BDA process and capabilities for leveraging data through three case studies of organizations that are prime users of BDA tools. Findings emphasize four key components of the BDA process framework: system coordination, data sourcing, big data application service, and end users. Additional building blocks are data security, privacy, and management, which provide supporting services to these four components across the information and technology value chains.
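
To make the relationship among these components concrete, the following is a minimal, hypothetical sketch of how the four components and the cross-cutting security, privacy, and management services could be modeled in code. All class and function names here are illustrative assumptions, not an implementation prescribed by the paper.

```python
# Hypothetical sketch of the BDA process framework described in the abstract:
# data sourcing, big data application service, system coordination, and end users,
# with security/privacy/management as cross-cutting services. Names are illustrative.
from dataclasses import dataclass, field
from typing import Callable


@dataclass
class DataSource:
    """Data sourcing: any provider of raw records (e-commerce, social media, sensors)."""
    name: str
    fetch: Callable[[], list[dict]]


@dataclass
class ApplicationService:
    """Big data application service: turns raw records into insight for end users."""
    name: str
    analyze: Callable[[list[dict]], dict]


@dataclass
class CrossCuttingServices:
    """Security, privacy, and data management applied across all components."""
    def enforce(self, records: list[dict]) -> list[dict]:
        # Illustrative privacy policy: drop identifying fields before analysis.
        return [{k: v for k, v in r.items() if k != "user_id"} for r in records]


@dataclass
class SystemCoordinator:
    """System coordination: orchestrates sourcing, governance, and analytics."""
    sources: list[DataSource]
    service: ApplicationService
    governance: CrossCuttingServices = field(default_factory=CrossCuttingServices)

    def run(self) -> dict:
        records = [r for s in self.sources for r in s.fetch()]
        return self.service.analyze(self.governance.enforce(records))


if __name__ == "__main__":
    # End users receive the output of the coordinated pipeline.
    clicks = DataSource("clickstream", lambda: [{"user_id": 1, "amount": 20.0},
                                                {"user_id": 2, "amount": 35.5}])
    revenue = ApplicationService("revenue_trend",
                                 lambda rows: {"total": sum(r["amount"] for r in rows)})
    print(SystemCoordinator(sources=[clicks], service=revenue).run())  # {'total': 55.5}
```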

Highlights

  • Big data have seen exponential growth over the past few decades, changing the information availability landscape and everything it interacts with

  • The big data analytics framework can be considered a generic big data system model

  • It represents the logical functional components of a common, technology-independent big data system, together with the interoperability interfaces between them, and can serve as a general technical reference framework for developing specific big data application system architectures


Summary

Introduction

Big data have seen exponential growth over the past few decades, changing the information availability landscape and everything it interacts with. The big data concept was first introduced in the 1990s [1], but it was not until the early 21st century that it had its revolutionary breakthrough, evolving from decision support and business intelligence (BI) systems [2]. Data are being generated at phenomenal rates. Recent research reports that only 5 exabytes of data had been generated by humans when the concept of big data first gained emphasis [4]; that volume of data can now be created in a day or two. In 2010, the world generated over 1 zettabyte of data, a figure that grew to 2.7 zettabytes by 2012.

