Abstract

Smartphones have become as much a part of daily life as televisions and cars, and they are now routinely used to access learning content and to communicate with lecturers. Because smartphones are equipped with many types of sensors, various kinds of multimedia content can interact with users through them, and sensor and information technology make it possible to detect a user's movement, emotion, and learning status. In the mobile learning environment, most learners (over 90% of Korea National Open University students) access learning content with their smartphones. However, few methods, technologies, or service models exist for tracking learners' learning status beyond focus and concentration level. In this paper, we propose a learning reaction model and criteria for determining learning state. As a simple approach, we propose smartphone sensor-based metrics, and a learning emotional reaction model between learners and learning content based on the learner's learning actions and smartphone usage state. The proposed Affective Learning Reaction Model (ALRM) consists of the learner's learning emotional state, personal environment state, and learning activity patterns. Each state has both mandatory and optional information: mandatory information comprises the key metrics used to decide the learner's learning emotional state, while optional information is complementary. Each metric has its own value, and the sum of these values determines the learner's learning state. With the ALRM, personalized and adapted learning content can be delivered to the learner.
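To make the metric-summing idea concrete, the following is a minimal sketch, assuming sensor-derived metrics normalized to numeric values; the metric names, the mandatory/optional split shown, and the down-weighting of optional metrics are illustrative assumptions, not values taken from the paper.

```python
# Illustrative sketch of ALRM-style scoring: each state holds mandatory and
# optional metrics, and the summed values give an overall learning-state score.
# Metric names and the 0.5 optional weight are assumptions for illustration.
from dataclasses import dataclass, field


@dataclass
class StateMetrics:
    """One ALRM state (emotional, personal environment, or activity patterns)."""
    mandatory: dict = field(default_factory=dict)  # key metrics for the state
    optional: dict = field(default_factory=dict)   # complementary metrics

    def score(self, optional_weight: float = 0.5) -> float:
        # Mandatory metrics count fully; optional metrics are down-weighted.
        return sum(self.mandatory.values()) + optional_weight * sum(self.optional.values())


def learning_state(states: list) -> float:
    """Sum the per-state scores to obtain the overall learning-state value."""
    return sum(s.score() for s in states)


# Hypothetical usage with sensor values normalized to [0, 1].
emotional = StateMetrics(mandatory={"facial_valence": 0.7}, optional={"touch_pressure": 0.4})
environment = StateMetrics(mandatory={"ambient_noise": 0.2}, optional={"location_stability": 0.9})
activity = StateMetrics(mandatory={"screen_focus_time": 0.8}, optional={"scroll_rate": 0.3})

print(learning_state([emotional, environment, activity]))
```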
