Administratively, incorrect or inconsistent data can lead to false conclusions and misdirected investments on both public and private scales. For instance, a government may want to analyze population census figures to decide which regions require further spending and investment in infrastructure and services. In this case, it is important to have access to reliable data in order to avoid erroneous fiscal decisions. In the business world, incorrect data can be costly. Many companies use customer information databases that record data like contact information, addresses, and preferences. For instance, if the addresses are inconsistent, the company will suffer the cost of resending mail or even of losing customers.
High-quality data needs to pass a set of quality criteria. Those include:

Validity: the degree to which the measures conform to defined business rules or constraints. Constraints include data-type, range, mandatory, uniqueness, set-membership, foreign-key, regular-expression, and cross-field constraints.
Accuracy: the degree of conformity of a measure to a standard or to a true value.
Completeness: the degree to which all required measures are known.
Consistency: the degree to which a set of measures is equivalent across systems.
Uniformity: the degree to which a set of data measures is specified using the same units of measure in all systems.
The term integrity encompasses accuracy, consistency, and some aspects of validation (see also data integrity), but it is rarely used by itself in data-cleansing contexts because it is insufficiently specific. (For example, "referential integrity" refers to the enforcement of the foreign-key constraints above.) Several of these constraint types are illustrated in the sketch below.
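As a concrete illustration, here is a minimal Python sketch of validity checks of the kinds listed above; the field names, rules, and sample record are hypothetical:

```python
import re

# Hypothetical validity rules for a customer record: each entry maps a
# field name to a predicate that returns True when the constraint holds.
RULES = {
    "customer_id": lambda v: v is not None,                 # mandatory
    "age": lambda v: isinstance(v, int) and 0 <= v <= 130,  # data-type + range
    "country": lambda v: v in {"US", "DE", "FR", "JP"},     # set membership
    "email": lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "") is not None,  # regex
}

def validate(record):
    """Return the names of the fields that violate their constraint."""
    return [field for field, ok in RULES.items() if not ok(record.get(field))]

record = {"customer_id": 42, "age": 200, "country": "DE", "email": "not-an-email"}
print(validate(record))  # ['age', 'email']
```

A foreign-key (referential integrity) check would follow the same pattern, with a predicate that tests membership in the set of keys of the referenced table.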
Good-quality source data has to do with a "data quality culture" and must be initiated at the top of the organization. It is not just a matter of implementing strong validation checks on input screens, because no matter how strong these checks are, they can often still be circumvented by the users. There is a nine-step guide for organizations that wish to improve data quality:[3][4]

1. Declare a high-level commitment to a data quality culture.
2. Drive process reengineering at the executive level.
3. Spend money to improve the data entry environment.
4. Spend money to improve application integration.
5. Spend money to change how processes work.
6. Promote end-to-end team awareness.
7. Promote interdepartmental cooperation.
8. Publicly celebrate data quality excellence.
9. Continuously measure and improve data quality.
Others include:

Parsing: for the detection of syntax errors. A parser decides whether a string of data is acceptable within the allowed data specification, much as a parser works with grammars and languages.
Data transformation: mapping data from its given format into the format expected by the target application. This includes value conversions or translation functions, as well as normalizing numeric values to conform to minimum and maximum bounds.
Duplicate elimination: duplicate detection requires an algorithm for determining whether data contains duplicate representations of the same entity. Usually, data is sorted by a key that brings duplicate entries closer together for faster identification (see the sketch after this list).
Statistical methods: by analyzing the data using the values of mean, standard deviation, range, or clustering algorithms, an expert can find values that are unexpected and thus erroneous. Although the correction of such data is difficult since the true value is not known, the problem can be resolved by setting the values to an average or other statistical value. Statistical methods can also be used to handle missing values, which can be replaced by one or more plausible values, usually obtained by extensive data augmentation algorithms.
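As a minimal sketch of two of these methods, the following Python fragment performs duplicate elimination via a sort key and flags statistical outliers by their distance from the mean; the data and the two-standard-deviation threshold are hypothetical choices:

```python
from statistics import mean, stdev

# Duplicate elimination: sort by a key so that duplicate representations
# of the same entity become adjacent, then keep the first of each run.
records = [("smith, j", 1), ("jones, a", 2), ("smith, j", 3)]
records.sort(key=lambda r: r[0])
deduped = [r for i, r in enumerate(records) if i == 0 or r[0] != records[i - 1][0]]

# Statistical method: flag values more than two standard deviations from
# the mean as suspect (the threshold of 2 is an arbitrary choice here).
values = [10.1, 9.8, 10.3, 9.9, 87.0, 10.0]
mu, sigma = mean(values), stdev(values)
suspects = [v for v in values if abs(v - mu) > 2 * sigma]

print(deduped)   # [('jones, a', 2), ('smith, j', 1)]
print(suspects)  # [87.0]
```

Note that the flagged value is only a candidate error; as the text says, the true value is unknown, so correction (e.g., replacing it with an average) is a separate decision.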
The essential job of this system is to find a suitable balance between fixing dirty data and maintaining the data as close as possible to the original data from the source production system. This is a challenge for the extract, transform, load (ETL) architect. The system should offer an architecture that can cleanse data, record quality events, and measure/control the quality of data in the data warehouse. A good start is to perform a thorough data profiling analysis that helps define the required complexity of the data cleansing system and also gives an idea of the current data quality in the source system(s).
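For illustration, here is a minimal data-profiling sketch in Python that reports per-column null counts, distinct counts, and value ranges; the table and column names are hypothetical:

```python
# Profile a table held as a list of dicts: for each column, report how
# many values are missing, how many distinct values occur, and the range.
rows = [
    {"id": 1, "city": "Berlin", "age": 34},
    {"id": 2, "city": None, "age": 29},
    {"id": 3, "city": "Berlin", "age": None},
]

def profile(rows):
    report = {}
    for col in rows[0].keys():
        values = [r[col] for r in rows if r[col] is not None]
        report[col] = {
            "nulls": len(rows) - len(values),
            "distinct": len(set(values)),
            "min": min(values),
            "max": max(values),
        }
    return report

for col, stats in profile(rows).items():
    print(col, stats)
```

Even a simple pass like this reveals which columns need null handling or range screens, which in turn sizes the cleansing system.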
Part of the data cleansing system is a set of diagnostic filters known as quality screens. Each of them implements a test in the data flow that, if it fails, records an error in the error event schema. Quality screens are divided into three categories (one example of each appears in the sketch below):

Column screens: testing an individual column, e.g. for unexpected values such as NULL values, non-numeric values that should be numeric, or out-of-range values.
Structure screens: testing the relationship of data across columns, typically foreign keys between tables, or checking that a group of columns is valid according to some structural definition it should adhere to.
Business rule screens: the most complex of the three. They test whether data, possibly across multiple tables, follow specific business rules, e.g. that a customer marked as a certain type of customer satisfies the rules that define that kind of customer.
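A minimal sketch of one screen from each category, expressed in Python as predicates over a row; the table layout and the business rule are hypothetical:

```python
# Each screen is a named predicate that returns True when the row passes.
VALID_CUSTOMER_IDS = {1, 2, 3}  # stands in for a foreign-key lookup table

def column_screen(row):
    # Column screen: 'amount' must be present and non-negative.
    return row.get("amount") is not None and row["amount"] >= 0

def structure_screen(row):
    # Structure screen: 'customer_id' must reference an existing customer.
    return row.get("customer_id") in VALID_CUSTOMER_IDS

def business_rule_screen(row):
    # Business rule screen (hypothetical rule): 'premium' customers
    # must have a credit limit of at least 10,000.
    return row.get("tier") != "premium" or row.get("credit_limit", 0) >= 10_000

SCREENS = [column_screen, structure_screen, business_rule_screen]

row = {"customer_id": 7, "amount": -5, "tier": "premium", "credit_limit": 500}
print([s.__name__ for s in SCREENS if not s(row)])  # all three screens fail
```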
When a quality screen records an error, it can either stop the dataflow process, send the faulty data somewhere other than the target system, or tag the data. The last option is considered the best solution, because the first requires that someone manually deal with the issue each time it occurs, and the second implies that data are missing from the target system (integrity), and it is often unclear what should happen to these data.
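A sketch of the tagging option: rather than halting the flow or diverting rows, each failed screen appends a tag to the row and logs an error event. The structures here are hypothetical stand-ins for the error event schema described below:

```python
error_events = []  # stands in for the error event fact table described below

def non_negative_amount(row):
    # A simple column screen: 'amount' must be present and non-negative.
    return row.get("amount") is not None and row["amount"] >= 0

def run_screens(row, screens, batch_id):
    """Apply every screen; tag failing rows instead of stopping the dataflow."""
    tags = [screen.__name__ for screen in screens if not screen(row)]
    for tag in tags:
        error_events.append({"batch": batch_id, "screen": tag, "row_id": row["id"]})
    row["quality_tags"] = tags
    return row  # the row still reaches the target system, tags attached

row = run_screens({"id": 17, "amount": -3.5}, [non_negative_amount], batch_id=42)
print(row["quality_tags"], error_events)
```

Because the row is delivered with its tags, the target system stays complete while downstream users can still filter or audit suspect data.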
Most data cleansing tools have limitations in usability:

Project costs: costs typically in the hundreds of thousands of dollars.
Time: mastering large-scale data-cleansing software is time-consuming.
Security: cross-validation requires sharing information, giving an application access across systems, including sensitive legacy systems.
The error event schema holds records of all error events thrown by the quality screens. It consists of an error event fact table with foreign keys to three dimension tables that represent the date (when), the batch job (where), and the screen (who produced the error). It also holds information about exactly when the error occurred and the severity of the error. In addition, there is an error event detail fact table with a foreign key to the main table; it contains detailed information about the table, record, and field in which the error occurred, as well as the error condition.
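A minimal sketch of this schema using Python's built-in sqlite3 module; the exact column names are assumptions, but the fact/dimension layout follows the description above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Dimension tables: when, where, and which screen produced the error.
    CREATE TABLE date_dim   (date_key   INTEGER PRIMARY KEY, day TEXT);
    CREATE TABLE batch_dim  (batch_key  INTEGER PRIMARY KEY, job_name TEXT);
    CREATE TABLE screen_dim (screen_key INTEGER PRIMARY KEY, screen_name TEXT);

    -- Error event fact table: one row per quality-screen failure.
    CREATE TABLE error_event (
        event_key   INTEGER PRIMARY KEY,
        date_key    INTEGER REFERENCES date_dim(date_key),
        batch_key   INTEGER REFERENCES batch_dim(batch_key),
        screen_key  INTEGER REFERENCES screen_dim(screen_key),
        occurred_at TEXT,     -- exact timestamp of the error
        severity    INTEGER
    );

    -- Detail fact table: pinpoints table, record, and field per event.
    CREATE TABLE error_event_detail (
        event_key  INTEGER REFERENCES error_event(event_key),
        table_name TEXT,
        record_id  TEXT,
        field_name TEXT,
        condition  TEXT       -- description of the failed error condition
    );
""")
```

The grain choice, one error_event row per screen failure with per-field specifics pushed into error_event_detail, mirrors the fact/dimension split described in the text.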
1. Wu, S. (2013). "A review on coarse warranty data and analysis". Reliability Engineering & System Safety. 114: 1–11. doi:10.1016/j.ress.2012.12.021.
2. "Data 101: What is Data Harmonization?". Datorama. 14 April 2017. Archived from the original on 24 October 2021. Retrieved 14 August 2019.
3. Kimball, R.; Ross, M.; Thornthwaite, W.; Mundy, J.; Becker, B. (2008). The Data Warehouse Lifecycle Toolkit. Wiley Publishing. ISBN 978-0-470-14977-5.
4. Olson, J. E. (2002). Data Quality: The Accuracy Dimension. Morgan Kaufmann. ISBN 1-55860-891-5.