Abstract
Students often use audio media during online or offline courses. However, lecture audio is mostly unstructured and extensive, which makes information browsing (i.e., chaining, linking, extraction, and evaluation of relevant information) more challenging. Conventional time-level skip control is of limited use for auditory information browsing because it is hard to identify the current playback position and its context. This paper presents HearIt, which provides semantic-level skip control with auditory cues for auditory information browsing. With HearIt, users can efficiently change the playback position at the paragraph level. Furthermore, two auditory cues (a positional cue and a topical cue) help users grasp the current playback position and its context without additional visual support. We conducted a pilot study with a prototype of HearIt, and the results show its feasibility and suggest design implications for future research.
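To make the interaction concrete, the following is a minimal sketch of a paragraph-level skip control that plays a positional cue and a topical cue before resuming playback, in the spirit of the approach described above. It is not the authors' implementation; the Paragraph and SemanticSkipPlayer names and the speak/seek callbacks are hypothetical stand-ins for a text-to-speech engine and an audio backend.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Paragraph:
    start_sec: float  # playback offset of the paragraph in the lecture audio
    topic: str        # short topic label for the topical cue


class SemanticSkipPlayer:
    """Paragraph-level skip control with positional and topical auditory cues."""

    def __init__(self, paragraphs: List[Paragraph],
                 speak: Callable[[str], None],
                 seek: Callable[[float], None]):
        self.paragraphs = paragraphs  # pre-segmented lecture audio
        self.speak = speak            # text-to-speech callback for cues
        self.seek = seek              # moves audio playback to a time offset
        self.index = 0

    def _announce_and_play(self) -> None:
        # Positional cue: where the listener is within the lecture.
        self.speak(f"Paragraph {self.index + 1} of {len(self.paragraphs)}")
        # Topical cue: what the current paragraph is about.
        self.speak(self.paragraphs[self.index].topic)
        self.seek(self.paragraphs[self.index].start_sec)

    def skip_forward(self) -> None:
        if self.index < len(self.paragraphs) - 1:
            self.index += 1
        self._announce_and_play()

    def skip_backward(self) -> None:
        if self.index > 0:
            self.index -= 1
        self._announce_and_play()


# Usage with stand-in callbacks: prints the cues instead of synthesizing speech.
player = SemanticSkipPlayer(
    paragraphs=[Paragraph(0.0, "course overview"), Paragraph(95.0, "grading policy")],
    speak=print,
    seek=lambda t: None,
)
player.skip_forward()  # announces "Paragraph 2 of 2", then "grading policy"
```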
Highlights
Nowadays, audio content contains a variety of helpful information
The results show that the proposed method significantly improves the efficiency and effectiveness of auditory information browsing without visual support
All the participants mentioned that they felt considerable limitations in auditory information browsing without visual support
Summary
In many classrooms, students record lectures and use the recordings for active learning. Such audio recording and replaying are essential for visually impaired people. Many people with visual impairments benefit from auditory guidance in daily activities such as studying [1], filling out a form [2], and taking a picture [3,4]. There have been many studies on supporting information access for visually impaired people. Some studies have proposed tools that enable users to process information by converting written text to spoken text. FingerReader [8] reads printed text aloud to make blind users aware of the information.