Introduction. The digitalization of education and healthcare is an inevitable response to societal demands. However, it is accompanied by many risks, and the need to regulate and evaluate new online practices is becoming increasingly important. Questions about the quality of online services are growing in significance, as is the search for new methods of monitoring the large amounts of publicly available data on online interactions with helping professionals. The purpose of this study is to develop a model for monitoring psychological and educational online services through automated analysis of user reviews and feedback.

Materials and methods. The sample consisted of reviews of online psychological and educational services: a total of 24,351 reviews of psychologists' work and 6,812 reviews of tutors' work were collected. The reviews were ranked to identify levels of user satisfaction. The following methods were used to collect and analyze the data: 1) automated data parsing via the GraphQL API to extract relevant feedback; 2) TF-IDF (term frequency-inverse document frequency) to identify the most important linguistic markers in the reviews; 3) LDA (latent Dirichlet allocation) for thematic modeling of the reviews (a minimal code sketch of the text-analysis steps follows the abstract).

Results of the study. An empirical study of online reviews of psychological and educational services, conducted in accordance with this model, revealed a thematic structure with four main areas: assessment of interaction (19.8%), assessment of results (26%), organization of professional interaction (20.4%), and assessment of professional skills relevant to clients (29.1%). The main characteristics that users value in psychological online services are comfortable communication (21.9%), inclusiveness (13.9%), specialist experience (5.9%), and the ability to structure a request (19.8%). User reviews also reflect how clients evaluate the outcomes of the psychologist's work; the most common responses are understanding the problem (18.5%), seeing the issue from a different perspective (6.7%), and feeling better (11.3%). Assessments of the quality of online education services emphasize accessibility and clarity of presentation (29.5%) and confidence and flexibility in communication (6.1%); finding a common language with students is also important (6.2%).

Conclusion. The analysis of user feedback on educational and psychological online services can help to identify directions for developing the work of helping specialists, create a system for monitoring non-specialized assistance, describe criteria for the training and retraining of specialists, and design educational programs for developing digital competence among psychologists and educators.
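Below is a minimal, illustrative sketch of the text-analysis part of the pipeline named in Materials and methods (TF-IDF term extraction followed by LDA topic modeling). The library choice (scikit-learn), the parameter values, and the placeholder reviews are assumptions made for illustration only and are not taken from the study; data collection via the GraphQL API is omitted.

# Illustrative sketch only: TF-IDF term extraction and LDA topic modeling on review texts.
# Library choice (scikit-learn), parameters, and the sample `reviews` are assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer, CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

reviews = [
    "The psychologist helped me see the issue from a different perspective",
    "Comfortable communication, I felt better after the first session",
    "The tutor explains the material clearly and finds a common language with students",
]  # placeholder texts; the study analyzed 24,351 psychologist and 6,812 tutor reviews

# TF-IDF: weight terms by how characteristic they are of individual reviews
tfidf = TfidfVectorizer(stop_words="english", max_features=1000)
tfidf_matrix = tfidf.fit_transform(reviews)

# LDA: fit a topic model on raw term counts (the study reports four thematic areas)
counts = CountVectorizer(stop_words="english", max_features=1000)
lda = LatentDirichletAllocation(n_components=4, random_state=0)
doc_topics = lda.fit_transform(counts.fit_transform(reviews))

# Print the top terms of each inferred topic
terms = counts.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Topic {k}: {', '.join(top_terms)}")

In such a setup, the document-topic matrix (doc_topics) would be used to assign each review to its dominant theme before computing theme shares of the kind reported in the results.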