Abstract

Purpose
This article describes the development process, sampling and analysis practices, and the assessment of reliability and validity of a new survey that sought to evaluate undergraduate students' perceptions and expectations related to privacy and library participation in learning analytics studies. This article provides other researchers with the information required to independently evaluate the survey's efficacy, as well as guidance for designing similar surveys.

Design/methodology/approach
Following question development, pre-survey validity assessments were made using subject matter expert panel review and cognitive interviews. Post-hoc analysis of survey construct reliability was conducted using the Omega coefficient, while exploratory factor analysis was used to assess construct validity. Survey design limitations and potential bias effects are also examined.

Findings
The survey exhibited a high level of reliability among research constructs, while the exploratory factor analysis results suggested that survey constructs contained multiple conceptual elements that should be measured separately for more nuanced analysis.

Practical implications
This article provides a model for other researchers wishing to reuse the survey described or to develop similar surveys.

Social implications
As interest in learning analytics continues to expand, engaging with the subjects of analysis, in this case students, is critical. Researchers need to ensure that captured measurements are appropriately valid in order to represent the findings accurately.

Originality/value
This survey is one of very few addressing library learning analytics that has undergone extensive validity analysis of its conceptual constructs.
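The abstract names the Omega coefficient as the reliability measure for survey constructs. As a rough, self-contained sketch (not the authors' actual analysis, which would typically use a dedicated package such as R's psych), McDonald's omega can be approximated from a unidimensional factor solution: with standardized loadings λ_i, omega ≈ (Σλ)² / ((Σλ)² + Σ(1 − λ²)). The example below estimates loadings from the first principal component of the item correlation matrix on synthetic Likert-style data; the data, item count, and loading values are all illustrative assumptions.

```python
import numpy as np

# Synthetic data: 500 respondents, 5 items driven by one common factor
# (illustrative only -- not the survey data described in the article).
rng = np.random.default_rng(0)
n_respondents, n_items = 500, 5
factor = rng.normal(size=(n_respondents, 1))
items = 0.7 * factor + 0.5 * rng.normal(size=(n_respondents, n_items))

def omega_pca(data):
    """Approximate McDonald's omega for a single construct.

    Uses first-principal-component loadings of the item correlation
    matrix as a stand-in for one-factor model loadings.
    """
    corr = np.corrcoef(data, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)          # ascending order
    loadings = np.sqrt(eigvals[-1]) * np.abs(eigvecs[:, -1])
    common = loadings.sum() ** 2                     # shared variance
    unique = (1.0 - loadings ** 2).sum()             # item-specific variance
    return common / (common + unique)

print(f"omega ~= {omega_pca(items):.3f}")
```

Values near 1 indicate that the items reliably measure a common construct; the EFA step the abstract describes would then check whether that construct is in fact unidimensional.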
