Measuring Data Quality for Ongoing Improvement
Morgan Kaufmann Publishers Inc. (publisher)
978-0-12-397033-6 (ISBN)
The Data Quality Assessment Framework shows you how to measure and monitor data quality, ensuring quality over time. You’ll start with general concepts of measurement and work your way through a detailed framework of more than three dozen measurement types related to five objective dimensions of quality: completeness, timeliness, consistency, validity, and integrity. Ongoing measurement, rather than one-time activities, will help your organization reach a new level of data quality. This plain-language approach to measuring data can be understood by both business and IT, and it provides practical guidance on how to apply the DQAF within any organization, enabling you to prioritize measurements and effectively report on results. Strategies for using data measurement to govern and improve the quality of data, and guidelines for applying the framework within a data asset, are included. You’ll come away able to prioritize which measurement types to implement, knowing where to place them in a data flow and how frequently to measure. Common conceptual models for defining and storing data quality results for purposes of trend analysis are also included, as are generic business requirements for ongoing measuring and monitoring, including the calculations and comparisons that make the measurements meaningful and help you understand trends and detect anomalies.
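To make the idea of ongoing measurement concrete, here is a minimal illustrative sketch, not taken from the book: a completeness measurement (one of the five dimensions named above) whose result is compared against historical results to flag anomalies. All function and field names here are hypothetical.

```python
# Illustrative sketch only: a simple completeness measurement plus a
# threshold-based comparison against prior results, in the spirit of
# ongoing data quality monitoring. Names are hypothetical, not the DQAF's.
from statistics import mean, stdev

def completeness(records, field):
    """Fraction of records with a non-null value for `field`."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) is not None)
    return filled / len(records)

def is_anomalous(current, history, sigmas=3.0):
    """Flag a measurement that deviates sharply from historical results."""
    if len(history) < 2:
        return False  # not enough history to compare against
    mu, sd = mean(history), stdev(history)
    return sd > 0 and abs(current - mu) > sigmas * sd

# One batch of incoming records; today's completeness of `zip` is 0.5,
# far below the historical trend, so it would be flagged for review.
batch = [{"id": 1, "zip": "19103"}, {"id": 2, "zip": None}]
today = completeness(batch, "zip")
history = [0.98, 0.97, 0.99, 0.98, 0.97]
print(today, is_anomalous(today, history))
```

Running such a check on every data delivery, and storing each result as a new point in the history, is one simple way to turn a one-time profiling exercise into the kind of trend analysis and anomaly detection the framework advocates.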
Laura Sebastian-Coleman, Data Quality Director at Prudential, has been a data quality practitioner since 2003. She has implemented data quality metrics and reporting, launched and facilitated working stewardship groups, contributed to data consumer training programs, and led efforts to establish data standards and manage metadata. In 2009, she led a group of analysts in developing the Data Quality Assessment Framework (DQAF), which is the basis for her 2013 book, Measuring Data Quality for Ongoing Improvement. An active professional, Laura has delivered papers, tutorials, and keynotes at data-focused conferences, such as MIT’s Information Quality Program, Data Governance and Information Quality (DGIQ), Enterprise Data World (EDW), Data Modeling Zone, and Data Management Association (DAMA)-sponsored events. From 2009 to 2010, she served as IAIDQ’s Director of Member Services. In 2015, she received the IAIDQ Distinguished Member Award. DAMA Publications Officer (2015 to 2018) and production editor for the DAMA-DMBOK2 (2017), she is also author of Navigating the Labyrinth: An Executive Guide to Data Management (2018). In 2018, she received the DAMA award for excellence in the data management profession. She holds a CDMP (Certified Data Management Professional) from DAMA, an IQCP (Information Quality Certified Professional) from IAIDQ, a Certificate in Information Quality from MIT, a B.A. in English and History from Franklin & Marshall College, and a Ph.D. in English Literature from the University of Rochester.
Section One: Concepts and Definitions
Chapter 1: Data
Chapter 2: Data, People, and Systems
Chapter 3: Data Management, Models, and Metadata
Chapter 4: Data Quality and Measurement
Section Two: DQAF Concepts and Measurement Types
Chapter 5: DQAF Concepts
Chapter 6: DQAF Measurement Types
Section Three: Data Assessment Scenarios
Chapter 7: Initial Data Assessment
Chapter 8: Assessment in Data Quality Improvement Projects
Chapter 9: Ongoing Measurement
Section Four: Applying the DQAF to Data Requirements
Chapter 10: Requirements, Risk, Criticality
Chapter 11: Asking Questions
Section Five: A Strategic Approach to Data Quality
Chapter 12: Data Quality Strategy
Chapter 13: Quality Improvement and Data Quality
Chapter 14: Directives for Data Quality Strategy
Section Six: The DQAF in Depth
Chapter 15: Functions of Measurement: Collection, Calculation, Comparison
Chapter 16: Features of the DQAF Measurement Logical Data Model
Chapter 17: Facets of the DQAF Measurement Types
Appendix A: Measuring the Value of Data
Appendix B: Data Quality Dimensions
Appendix C: Completeness, Consistency, and Integrity of the Data Model
Appendix D: Prediction, Error, and Shewhart’s lost disciple, Kristo Ivanov
Glossary
Bibliography
Publication date (per publisher) | 20.2.2013 |
---|---|
Series | The Morgan Kaufmann Series on Business Intelligence |
Place of publication | San Francisco |
Language | English |
Dimensions | 191 x 235 mm |
Weight | 750 g |
Subject area | Computer Science ► Databases ► Data Warehouse / Data Mining; Business ► Business Administration / Management |
ISBN-10 | 0-12-397033-4 / 0123970334 |
ISBN-13 | 978-0-12-397033-6 / 9780123970336 |
Condition | New |