Abstract

Modelling languages have proved to be effective tools for specifying and analysing various perspectives of enterprises and information systems. Beyond the design of modelling languages themselves, research on model quality and on modelling language quality evaluation has contributed to the maturity of the model-driven engineering (MDE) field. Although this consolidated knowledge on quality evaluation remains relevant, in previous work we identified misalignments between the topics addressed by academia and the needs of industry in applying MDE, revealing several open challenges. In this paper, we focus on the need for a method to evaluate the quality of a set of modelling languages used in combination within an MDE environment. We present MMQEF (Multiple Modelling language Quality Evaluation Framework), describing its foundations, presenting its method components, and discussing its trade-offs.
