Abstract

The multizone airflow simulation program COMIS was evaluated within an International Energy Agency research program. One step in the evaluation procedure was to test the user–code interface, which comprises not only the appearance of the computer screen but also the user guide and any other tutorial or help system. The user–code interface of COMIS was therefore tested through round robin tests. Two types of problems were submitted to several users: a simple, well-defined problem and a real-world problem. This study first led to substantial improvements of the user guide. While results for the well-defined case were very close to one another, large differences were observed for the `real world' case. Simulation results depend strongly on the options chosen by the user, and users readily make modelling errors when the studied case becomes complex.
