Abstract
Scientists and engineers involved in the design of complex system solutions use computational workflows for their evaluations. As system complexity grows, the complexity of these workflows increases as well. Without integration tools, scientists and engineers often spend considerable effort integrating software tools and model sets, which distracts from their original research or engineering aims. This article therefore introduces a new framework for streamlining the creation and use of automated computational workflows. It builds on state-of-the-art technologies for automation (e.g., container automation) and coordination (e.g., distributed message-oriented middleware), and on a novel microservice-based architecture for distributed process execution and coordination. It also supports co-simulations as part of larger workflows that include additional auxiliary computational tasks, e.g., forecasting or data transformation. Using Apache NiFi, an easy-to-use web interface is provided to create, run, and control workflows without requiring concern for the underlying computing infrastructure. Initial testing of the framework through the implementation of a real-world workflow demonstrates promising performance in terms of parallelizability, low overhead, and reliable coordination.
Highlights
Traditional science often follows the path of formulating a problem statement, creating a scientific experiment as a verification environment, and running the experiment to gather data, which is then analyzed to either verify or falsify the assumptions made
To evaluate the behaviour and performance of the presented framework regarding parallel processing as discussed in the previous chapter, a common use case within Energy Systems Analysis that benefits from parallelization was executed with different numbers of parallel task instances (see the sketch after this list)
The framework performs significantly better at model integration than a tighter coupling approach, which always involves sophisticated and time-consuming alterations of the code base
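The highlights describe running the same task with a varying number of parallel instances to measure scaling. As a rough illustration of that evaluation pattern (not the paper's actual test harness), the following Python sketch runs a placeholder task over many independent scenarios with different worker counts; the function names and the scenario count are hypothetical:

from concurrent.futures import ProcessPoolExecutor
import time

# Hypothetical stand-in for a real simulation run; the paper's
# actual tasks are container-based model executions.
def run_scenario(scenario_id: int) -> float:
    time.sleep(0.1)  # placeholder for actual model execution time
    return float(scenario_id)

def evaluate(num_workers: int, num_scenarios: int = 32) -> float:
    # Execute all scenarios with the given number of parallel workers
    # and return the elapsed wall-clock time.
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=num_workers) as pool:
        list(pool.map(run_scenario, range(num_scenarios)))
    return time.perf_counter() - start

if __name__ == "__main__":
    # Compare scaling behaviour across different degrees of parallelism.
    for n in (1, 2, 4, 8):
        print(f"{n} workers: {evaluate(n):.2f} s")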
Summary
Traditional science often follows the path of formulating a problem statement, creating a scientific experiment as a verification environment, and running the experiment to gather data, which is then analyzed to either verify or falsify the assumptions made. To gain a broader view of a system, it is typically necessary to use multiple models, simulators, and other auxiliary tools (e.g., optimization tools), and to combine their input and output data flows, forming a more complex computational scientific workflow for the evaluation of key system properties. Such a workflow can be formally described by defining the connections between model output and input streams and the coordinated execution of the model logic to implement sequential, iterative, or simultaneous execution. To achieve a higher degree of workflow automation, the model tasks within an automated workflow can be complemented by any other kind of auxiliary processing task, e.g., tasks performing data transformation, validation, or visualization.
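The article does not provide code, but a minimal Python sketch may help illustrate what such a formal workflow description could look like: tasks expose named output and input streams, and edges wire an upstream output to a downstream input. The Task and Workflow classes, the connect helper, and the task names below are hypothetical illustrations, not the framework's actual API:

from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    inputs: list = field(default_factory=list)    # names of input streams
    outputs: list = field(default_factory=list)   # names of output streams

@dataclass
class Workflow:
    tasks: dict = field(default_factory=dict)
    edges: list = field(default_factory=list)     # (src_task, out, dst_task, in)

    def add(self, task: Task) -> Task:
        self.tasks[task.name] = task
        return task

    def connect(self, src: Task, out: str, dst: Task, inp: str) -> None:
        # Wire a model's output stream to another model's input stream.
        assert out in src.outputs and inp in dst.inputs
        self.edges.append((src.name, out, dst.name, inp))

# Example: a forecasting task feeds a simulation, whose results feed
# an auxiliary visualization task (all names invented for illustration).
wf = Workflow()
forecast = wf.add(Task("load_forecast", inputs=["weather"], outputs=["demand"]))
grid_sim = wf.add(Task("grid_simulation", inputs=["demand"], outputs=["line_loads"]))
report = wf.add(Task("visualization", inputs=["line_loads"]))

wf.connect(forecast, "demand", grid_sim, "demand")
wf.connect(grid_sim, "line_loads", report, "line_loads")

A coordination layer, such as the message-oriented middleware mentioned in the abstract, would then schedule the tasks along these edges sequentially, iteratively, or simultaneously.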