Abstract

This chapter provides a frame of reference for defining metrics associated with data quality. It also shows how the dimensions of data quality encompass its intrinsic aspects, and the ways that metrics can be applied to data quality monitoring within an enterprise. The goal is to provide a medium for communicating confidence that the business is not being affected by violations of data quality rules, and for showing how a trending improvement or regression in data quality compliance relates to operational efficiency or competitive advantage. Supporting these goals relies on defining key data quality performance metrics and associating a set of rules that roll up into those metrics. By defining data quality rules whose observance can be measured at various levels across the enterprise information architecture, the data quality practitioner can assemble scorecards for evaluating the stability and predictability of data quality measurements, and can differentiate common causes from special causes of data failures. Statistical process control serves as the analysis tool for measuring and charting the conformance of information to those data quality rules, supporting the resolution of data quality issues based on an objective assessment of an organization's level of data quality maturity with respect to those dimensions.
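Two of the mechanics the abstract describes, rolling individual rule conformance up into a scorecard metric and using statistical process control to separate common-cause from special-cause variation, can be sketched in a few lines of Python. This is a minimal illustration under assumed inputs: the rule names, weights, and daily audit counts below are hypothetical, and the p-chart with 3-sigma limits is one standard SPC technique, not necessarily the chapter's exact formulation.

```python
import math

# --- Rolling rules up into a scorecard metric ---------------------
# Hypothetical per-rule audit results for one day: (rule name,
# records checked, records conforming, business-impact weight).
rules = [
    ("customer_email_valid",    10_000, 9_850, 0.5),
    ("order_total_nonnegative", 10_000, 9_990, 0.3),
    ("ship_date_after_order",   10_000, 9_920, 0.2),
]
# Weighted average of per-rule conformance rates; the weights are
# an illustrative choice, not prescribed by the chapter.
score = sum(w * ok / n for _, n, ok, w in rules) / sum(w for *_, w in rules)
print(f"scorecard metric: {score:.4f}")

# --- Statistical process control over a single rule ---------------
# Daily (records checked, records violating) counts for one rule.
daily_audits = [
    (1_000, 12), (1_000, 9), (1_000, 14), (1_000, 11),
    (1_000, 10), (1_000, 13), (1_000, 58), (1_000, 12),
]
total_n = sum(n for n, _ in daily_audits)
p_bar = sum(d for _, d in daily_audits) / total_n  # center line

for day, (n, defects) in enumerate(daily_audits, start=1):
    p = defects / n
    # 3-sigma control limits for a p-chart (proportion nonconforming).
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    ucl, lcl = p_bar + 3 * sigma, max(0.0, p_bar - 3 * sigma)
    # Inside the limits: common-cause variation. Outside the limits:
    # a special cause that warrants root-cause investigation.
    flag = "SPECIAL CAUSE" if not (lcl <= p <= ucl) else "in control"
    print(f"day {day}: p={p:.3f} limits=[{lcl:.3f}, {ucl:.3f}] {flag}")
```

With these sample numbers, day 7 (p = 0.058, well above an upper control limit near 0.030) is flagged as a special cause, while the remaining days fall within common-cause variation and would not, on their own, justify intervention.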
