Abstract

Data quality plays a vital role in the reliability of data for planning and decision making. The methods used for data collection and entry further heighten concerns about data quality. This paper describes double data entry as an efficient and simplified technique for improving the quality of paper-based records. Data from the operations research component of a larger health intervention project in Abia State, South-east Nigeria, were used for the demonstration. Paper-based data were entered independently by two data entry clerks, each with a unique identifier (ID), using the Open Data Kit (ODK) application. The data were then exported in .csv format into Microsoft Excel and compared for discordant entries. The algorithm auto-compared all records entered by the two clerks and returned zero values for concordant entries and non-zero values for discordant ones. This allowed easy spot checks against the questionnaires and subsequent correction of erroneous entries. Double data entry is efficient, cost-effective and robust in achieving high data quality with paper-based records.
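The comparison step described above can be sketched in a few lines of Python. This is a minimal illustration, not the study's actual implementation: the function, field names and record values below are hypothetical, and it assumes each clerk's CSV export has been read into a list of dictionaries keyed by a shared record identifier. As in the abstract, concordant fields yield zero (and are omitted), while discordant fields are flagged with a non-zero value for spot-checking against the paper questionnaire.

```python
def compare_entries(entry_a, entry_b, key="record_id"):
    """Compare two independent data entries field by field.

    Returns {record_id: {field: 1}} for every field where the two
    entries disagree; fully concordant records are omitted.
    """
    b_by_key = {row[key]: row for row in entry_b}
    discordant = {}
    for row in entry_a:
        other = b_by_key.get(row[key], {})
        # Flag each field whose value differs between the two clerks.
        diffs = {f: 1 for f, v in row.items()
                 if f != key and other.get(f) != v}
        if diffs:
            discordant[row[key]] = diffs
    return discordant


# Illustrative records only (hypothetical field names, not study data).
clerk1 = [{"record_id": "001", "age": "34", "sex": "F"},
          {"record_id": "002", "age": "27", "sex": "M"}]
clerk2 = [{"record_id": "001", "age": "34", "sex": "F"},
          {"record_id": "002", "age": "72", "sex": "M"}]

print(compare_entries(clerk1, clerk2))
# Record "002" is flagged: the clerks typed different ages.
```

A spreadsheet formula subtracting one clerk's cell from the other's achieves the same zero/non-zero signal for numeric fields, which matches the Excel-based workflow the paper describes.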

Highlights

  • Research is carried out to discover truths, validate hypotheses, increase knowledge, and foster better, evidence-based solutions to established problems of interest

  • Data quality plays a vital role in the reliability of data for planning and decision making

  • This paper addresses the techniques of double data entry as an efficient and simplified approach designed to improve the quality of paper-based records


Introduction

Research is carried out to discover truths, validate hypotheses, increase knowledge, and foster better, evidence-based solutions to established problems of interest. To minimize the introduction of errors and guarantee the quality of research data, researchers often adopt different approaches at various stages of a study. Integrity measures such as controls, built-in logic and validation checks are frequently applied in data collection and entry processes to minimize the occurrence of errors. Notwithstanding these integrity measures, errors and omissions are still common in data entry, owing to well-known, largely unavoidable factors such as speed of entry, fatigue, and the age and level of experience of the data clerks (Scott, Thompson, Wright-Thomas, Xu, & Barchard, 2008). These data entry errors can sometimes be severe enough to invalidate the inferences and conclusions drawn from a study's results. Double data entry is one step that has been shown to reduce data entry errors and improve the quality of research outputs (Coleman Data Solutions, 2014).

