Abstract

The design of engineered systems is a process punctuated by design reviews that pursue various goals, such as sharing a common understanding of intermediate representations, verifying that the design meets the requirements, and making informed decisions about the future of the project. In a model-based design approach, one difficulty of design reviews is establishing common ground, since participants must understand the various models that serve as intermediate representations for communication during meetings. This observation has motivated new technology-mediated design review situations, yet with no clear evidence of progress. Indeed, a wide variety of design review environments exist, but very few comparisons of these environments have been made to date. This is mainly due to the lack of benchmark problems for evaluating candidate design review environments that claim to facilitate the understanding of model-centric designs. This paper proposes an open-science benchmark exercise that includes a definition of the pursued goals, measures of performance, the sources of a telescope model-centric design presented from three viewpoints in line with the universal Function-Behaviour-Structure (FBS) ontology, and a systematic experimental protocol. This benchmark problem should enable anyone to provide objective evidence of progress regarding new environments for reviewing model-centric designs.
