Abstract

Flutter has become popular for providing a uniform development environment for user interfaces (UIs) on smartphones, web browsers, and desktop applications. We have developed the Flutter programming learning assistant system (FPLAS) to assist novice students in self-study. We implemented a Docker-based Flutter environment with Visual Studio Code and three introductory exercise projects. However, the correctness of students' answers is currently checked manually, although automatic checking is necessary to reduce teachers' workloads and give students quick feedback. This paper presents an image-based UI testing method that automates the testing of answer code using the Flask framework. The method generates a UI image by running the answer code and compares it with the image produced by the assignment's model code, using the ORB and SIFT algorithms in the OpenCV library. Notably, multiple UI screenshots must be captured across page transitions triggered by user input actions to accurately detect changes in UI elements. For evaluation, we assigned five Flutter exercise projects to fourth-year bachelor's and first-year master's engineering students at Okayama University, Japan, and applied the proposed method to their answers. The results confirm the effectiveness of the proposal.
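The abstract describes comparing a UI screenshot rendered from the answer code against one rendered from the model code using ORB/SIFT feature matching in OpenCV. Below is a minimal sketch of such a comparison step, assuming the two screenshots are already captured as image files; the file names, distance threshold, and acceptance cutoff are illustrative assumptions, not values taken from the paper.

```python
# Hypothetical sketch of an ORB-based screenshot comparison (not the paper's exact code).
import cv2


def orb_similarity(model_path: str, answer_path: str) -> float:
    """Return the ratio of good ORB matches between two UI screenshots."""
    model = cv2.imread(model_path, cv2.IMREAD_GRAYSCALE)
    answer = cv2.imread(answer_path, cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(model, None)
    kp2, des2 = orb.detectAndCompute(answer, None)
    if des1 is None or des2 is None:
        return 0.0

    # Brute-force Hamming matcher is the standard pairing for binary ORB descriptors.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)

    # Count matches whose descriptor distance is small enough to treat as "good".
    good = [m for m in matches if m.distance < 40]  # threshold is an assumption
    return len(good) / max(len(kp1), len(kp2), 1)


if __name__ == "__main__":
    # File names are placeholders for the model and answer screenshots.
    score = orb_similarity("model_ui.png", "answer_ui.png")
    print("Answer accepted" if score > 0.7 else "Answer rejected", round(score, 3))
```

A SIFT-based variant would follow the same structure with `cv2.SIFT_create()` and an L2-norm matcher; in practice such a checker would be exposed as a Flask endpoint that receives the screenshots captured after each page transition.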
