Abstract

Professionals in a discipline often interact with other professionals to keep up to date in their field, to overcome impasses, and to answer questions; in short, to meet their knowledge needs. Such professionals are essentially engaged in lifelong learning, and a platform that helps them interact with each other essentially supports a community of professional learners. In our research we have been studying one such community, the community of programmers supported by Stack Overflow (SO), with the ultimate goal of diagnosing the knowledge needs of SO users in such an open-ended and evolving learning environment. In this paper, we report on a study that is a step toward achieving this goal. In particular, we diagnosed the knowledge of users in SO to see whether their performance in answering questions could be predicted from their previous behavior. We used a tag-based knowledge model and a Naive Bayes model to make predictions. We measured the success of our predictions using 10-fold cross-validation, root mean square error, and mean absolute error. Over different sample sizes and different numbers of tags, we achieved prediction accuracy ranging between 84.644% and 91.709%, root mean square error ranging between 0.0517 and 0.0629, and mean absolute error ranging between 0.011 and 0.0115. This level of success suggests the potential to provide adaptive feedback about an individual's knowledge needs even before poor answers are provided. The approach has the further advantages of being lightweight (requiring minimal knowledge engineering) and of having the potential to evolve naturally with both changes in the learner's knowledge and changes in the disciplinary knowledge.
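The abstract does not specify the implementation of the tag-based Naive Bayes predictor, but the general idea can be sketched. The following is a minimal illustration, not the authors' code: it assumes binary good/poor answer labels, represents a user's history as the tags of questions they answered, and applies multinomial Naive Bayes with Laplace smoothing. All function names and toy data below are our own assumptions for illustration.

```python
import math
from collections import Counter

def train_nb(samples, alpha=1.0):
    """Train a multinomial Naive Bayes model over tag features.

    samples: list of (tags, label) pairs, where tags is a list of tag
    strings from questions the user answered and label is 1 (good
    answer) or 0 (poor answer). alpha is the Laplace smoothing constant.
    """
    class_counts = Counter()
    tag_counts = {0: Counter(), 1: Counter()}
    vocab = set()
    for tags, label in samples:
        class_counts[label] += 1
        for t in tags:
            tag_counts[label][t] += 1
            vocab.add(t)
    total = sum(class_counts.values())
    return {
        "prior": {c: class_counts[c] / total for c in (0, 1)},
        "tag_counts": tag_counts,
        "class_totals": {c: sum(tag_counts[c].values()) for c in (0, 1)},
        "vocab": vocab,
        "alpha": alpha,
    }

def predict_proba(model, tags):
    """Return P(label = 1 | tags) via Bayes' rule with smoothing."""
    V = len(model["vocab"])
    a = model["alpha"]
    log_post = {}
    for c in (0, 1):
        lp = math.log(model["prior"][c])
        for t in tags:
            num = model["tag_counts"][c][t] + a          # smoothed count
            den = model["class_totals"][c] + a * V       # smoothed total
            lp += math.log(num / den)
        log_post[c] = lp
    m = max(log_post.values())  # shift for numerical stability
    exp = {c: math.exp(log_post[c] - m) for c in (0, 1)}
    return exp[1] / (exp[0] + exp[1])

def rmse_mae(model, samples):
    """Root mean square error and mean absolute error of the
    predicted probabilities against the 0/1 labels."""
    errs = [predict_proba(model, tags) - label for tags, label in samples]
    rmse = math.sqrt(sum(e * e for e in errs) / len(errs))
    mae = sum(abs(e) for e in errs) / len(errs)
    return rmse, mae
```

In a full evaluation such as the one the abstract describes, training and scoring would be repeated over the 10 folds of a cross-validation split, averaging the resulting error measures.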
