Abstract

In the era of Big Data, many NoSQL databases emerged for storing and subsequently processing vast volumes of data, using data structures that follow columnar, key-value, document, or graph formats. In analytical contexts that require a Big Data Warehouse, Hive is often the driving force, enabling the analysis of vast amounts of data. Data models in Hive are usually defined taking into account the queries that need to be answered. In this work, a set of rules is presented for transforming multidimensional data models into Hive tables, making data available at different levels of detail. These levels are suited to answering different queries, depending on the analytical needs. After the Hive tables are identified, the paper summarizes a demonstration case in which the implementation of a specific Big Data architecture shows how a traditional Data Warehouse can evolve into a Big Data Warehouse.
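
As a minimal illustrative sketch (not the paper's actual transformation rules), and assuming a hypothetical sales star schema, the idea of exposing the same facts at different levels of detail can be expressed as two Hive tables: one at the finest grain and one pre-aggregated for coarser analytical queries.

    -- Hypothetical fact table at the finest grain: one row per day, product and store.
    CREATE TABLE sales_daily (
      sale_date    DATE,
      product_id   STRING,
      store_id     STRING,
      quantity     INT,
      revenue      DECIMAL(18,2)
    )
    STORED AS ORC;

    -- A coarser level of detail materialized as its own Hive table (month x product),
    -- pre-aggregated for queries that do not need daily or store-level granularity.
    CREATE TABLE sales_monthly_by_product
    STORED AS ORC
    AS
    SELECT trunc(sale_date, 'MM') AS sale_month,
           product_id,
           SUM(quantity)          AS total_quantity,
           SUM(revenue)           AS total_revenue
    FROM   sales_daily
    GROUP BY trunc(sale_date, 'MM'), product_id;

Queries that only need monthly totals per product would read the smaller aggregated table, while detailed analyses would go to the fine-grained one; which levels to materialize depends on the analytical needs.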
