Abstract

Quality properties such as performance, security, and compliance are crucial for a system's success but are hard to prove, especially for complex systems. Data flow analyses support such proofs but often consider only source code and thereby cause high repair costs for issues found late. Existing data flow analyses at the architectural design level use call-and-return semantics or event-based communication between components, but they neither define data flows as first-class entities nor consider important runtime or deployment configurations. We propose introducing data flows as first-class entities at the architectural level. Analyses then ensure that systems meet their quality requirements even after changes to, e.g., runtime or deployment configurations. Modeling data flows as first-class entities allows analyzing compliance with privacy laws, requirements for external service providers, and throughput requirements in big data scenarios at the architectural level. The results enable early, cost-efficient fixing of issues.
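
To make "data flows as first-class entities" more concrete, the following is a minimal sketch, not taken from the paper: the Component and DataFlow classes, the deployment_location attribute, and the violates_privacy check are hypothetical illustrations of how a privacy-compliance analysis could operate on such an architectural model after a deployment change, without touching source code.

from dataclasses import dataclass


@dataclass
class Component:
    """An architectural component, e.g. a service in the system design (hypothetical)."""
    name: str
    deployment_location: str  # assumed deployment attribute, e.g. "EU" or "US"


@dataclass
class DataFlow:
    """A data flow modeled as a first-class entity between two components (hypothetical)."""
    source: Component
    target: Component
    data_classification: str  # e.g. "personal" or "anonymous"


def violates_privacy(flows: list[DataFlow], allowed_locations: set[str]) -> list[DataFlow]:
    """Return flows that move personal data to a disallowed deployment location."""
    return [
        f for f in flows
        if f.data_classification == "personal"
        and f.target.deployment_location not in allowed_locations
    ]


# Example: a deployment change moves the analytics component outside the EU;
# the architectural analysis flags the affected data flow before any code exists.
frontend = Component("Frontend", deployment_location="EU")
analytics = Component("Analytics", deployment_location="US")
flows = [DataFlow(frontend, analytics, data_classification="personal")]
print(violates_privacy(flows, allowed_locations={"EU"}))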
