Abstract

Tools for automatically executing and validating scripted sets of user actions are not native to most applications, which makes reliable recognition of interaction elements difficult when GUI elements are either overabundant or absent. This work describes the design and development of an internal tool that achieves accurate recognition of interaction elements by decorating the input-reading mechanisms of the Unity game engine. Previously, Unity offered no solutions for automating success tests, testing the main flow of a use case, or other forms of functional testing, apart from the mechanisms of the standard unit-testing package. However capable unit-testing tools are, covering every possible scenario of actions within a project would require either writing a dedicated test module for each aspect or building a complex universal system on top of that package. The workaround presented here for the technical limitation of the standard Input class dramatically reduces the time required for the application-testing phase. Beyond testing, this form of recording user actions in a virtual environment can be used to generate training-simulator scenarios, removing the routine work of authoring training tracks. For instance, a single immersion in a virtual biotechnology laboratory by an expert who correctly performs all phases of a selected analysis (for example, ELISA) is enough to record the checkpoints of such a scenario for teaching students.
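The core idea of the abstract, wrapping an engine's input-reading API in decorators so that live user input can be recorded once and replayed deterministically during automated tests, can be sketched as follows. This is a minimal, language-agnostic illustration in Python, not the paper's actual implementation; all names here (`InputSource`, `RecordingInput`, `ReplayInput`) are assumptions introduced for the example, and a real Unity version would wrap the engine's `Input` class in C#.

```python
class InputSource:
    """Stands in for the engine's native input reader (e.g. Unity's Input)."""
    def get_button(self, name: str) -> bool:
        # A real implementation would poll hardware; stubbed here.
        return False


class RecordingInput(InputSource):
    """Decorator: forwards queries to the real source and logs each result."""
    def __init__(self, inner: InputSource):
        self.inner = inner
        self.log: list[tuple[str, bool]] = []

    def get_button(self, name: str) -> bool:
        value = self.inner.get_button(name)
        self.log.append((name, value))
        return value


class ReplayInput(InputSource):
    """Decorator: ignores hardware and replays a previously recorded log."""
    def __init__(self, log: list[tuple[str, bool]]):
        self.queue = list(log)

    def get_button(self, name: str) -> bool:
        recorded_name, value = self.queue.pop(0)
        assert recorded_name == name, "replay diverged from recorded scenario"
        return value


# Record a session, then replay it deterministically in a test run.
recorder = RecordingInput(InputSource())
recorder.get_button("submit")
replay = ReplayInput(recorder.log)
print(replay.get_button("submit"))  # same value the user produced live
```

Because game code depends only on the `InputSource` interface, swapping the recording decorator for the replaying one requires no changes to the code under test, which is what makes the approach attractive compared with writing a bespoke test module per feature.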
