Abstract

The problems associated with processing large amounts of data have initiated research into special software that allows such data to be processed online. A well-known example of such software is the MapReduce computational model developed and implemented by Google. The advantages of MapReduce are the high processing speed for large data arrays, achieved through data decomposition and reduction, and the ability to implement the model on standard hardware. Creating algorithms and programs that follow the principles of the MapReduce model depends on the specifics of the tasks being solved and falls to the software developers. Most of the algorithms known today are designed to process large arrays of data arriving at a computer system online, without changing the data models (i.e., the data is processed as it enters the system in the data stream). At the same time, it is possible to distinguish classes of tasks for which the data on the objects under study are redundant, so their volume can be significantly reduced even before the data becomes available for transformation. As shown in the article, this class includes tasks of mathematical simulation of complex engineering objects, whose data models are represented as mathematical equations describing the physical states of the objects. The authors discuss the problems of decomposing and reducing the models at the level of transformations of these mathematical equations, which is why they call this approach algorithmic decomposition and reduction.
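To make the decomposition-and-reduction idea behind MapReduce concrete, the following minimal Python sketch (not from the article; the word-count workload and all function names are illustrative assumptions) splits the input into independent chunks, applies a map step to each, and reduces the partial results into a final answer:

```python
from collections import Counter
from functools import reduce

def map_chunk(chunk):
    """Map step: compute a partial word count for one data chunk."""
    return Counter(chunk.split())

def reduce_counts(left, right):
    """Reduce step: merge two partial results into one."""
    return left + right

# Decomposition: the input is split into independent chunks,
# which in a real MapReduce system would be processed on separate nodes.
data = ["big data needs decomposition", "decomposition enables reduction"]
partials = [map_chunk(chunk) for chunk in data]

# Reduction: partial results are combined into the final result.
total = reduce(reduce_counts, partials, Counter())
print(total)
```

Because each chunk is processed independently, the map step parallelizes across commodity machines, which is the property the abstract credits for MapReduce's speed on large data arrays.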
