Purpose – This work presents a prototype of the system along with the results of a technical evaluation and a study on the possible effects of recordings with active camera control on the learner. An increasing number of higher education institutions have adopted lecture recording technology in the past decade. Although some solutions already show a very high degree of automation, active camera control can still be realized only with human labor. Aiming to fill this gap, the LectureSight project is developing a free solution for active autonomous camera control for presentation recordings. The system uses a monocular overview camera to analyze the scene. Adopters can formulate camera control strategies in a simple scripting language to adapt the system's behavior to the specific characteristics of a presentation site.

Design/methodology/approach – The system is built on a highly modularized architecture to make it easily extensible. The prototype has been tested in a seminar room and a large lecture hall. Furthermore, a study was conducted in which students from two universities prepared for a simulated exam with an ordinary lecture recording and a recording produced with the LectureSight technology.

Findings – The technical evaluation showed good performance of the prototype but also revealed some technical constraints. The results of the psychological study give evidence that learners might benefit from lecture videos in which the camera follows the presenter so that gestures and facial expressions are easily perceptible.

Originality/value – The LectureSight project is the first open-source initiative to address camera control for presentation recordings. This opens the way for other projects to build upon the LectureSight architecture. The simulated exam study gave evidence of a beneficial effect on students' learning success and needs to be replicated. Also, if the effect proves consistent, the mechanism behind it is worth investigating further.