Abstract
Learnability is an important aspect of user interaction that measures how quickly a user can become familiar with a piece of software. Evaluation methods based on expert analysis or user questionnaires cannot fully capture the learnability of software. Automated testing can record user performance data and provide an objective evaluation of learnability; however, embedding recording code to conduct automated tests can be expensive. This work proposes a novel automated testing method for evaluating the learnability of existing software. Using the Figma and Maze applications, a replica of the evaluated software is created and instrumented with a user performance recording module with relative ease. The experimental results show that learnability data can be acquired objectively. In the experiment, users of the evaluated software required an average of 3 iterations to learn it, while the average completion time was around 2.37 seconds per action for trained respondents and 1.86 seconds for untrained respondents.