In Data Quality

During the 2000 presidential election, massive problems with voter data quality fed weeks of national anxiety over the lack of a clear winner. Poor ballot design led to errors in data processing, which in turn led to flawed predictions, recounts and lawsuits.

Nike’s 2001 switch to a new demand and supply inventory system carried flawed data over from the old system, and the bad data interacting with the new software triggered massive over-production of some products and under-production of others. Nike’s stock tumbled 25% and the company lost $100 million in a single quarter.

These are just a couple of examples where bad data, the product of poor maintenance, poor data collection and flawed data structures, created data bombs that went off with costly results. Is your company paying attention to the quality of the data accumulating in its databases? Are you building a data bomb that is sitting there, waiting to explode with devastating (and expensive) consequences?

Causes of Poor Data Quality:

The Bad News – Ironically, the technology that was supposed to make our data better and more useful is part of the problem. As we become increasingly dependent on automated processes, we sometimes fail to pay sufficient attention to how the data they generate is stored and maintained. We must also take into account that legacy data may be not only dated but flawed in the way it was originally structured.

In the early days of computer technology, those who built databases often created custom work-arounds to cope with the limitations of existing database software and scarce storage space. Bank routing, customer account and shipping invoice numbers wound up stuffed into integer fields of limited size, for instance. Later, as these numbers grew longer, they no longer fit. As new data systems have come on-line, such cobbled-together solutions have come back to haunt us.
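As a rough illustration, the sketch below (in Python, with hypothetical account numbers and a hypothetical fits_legacy_field check) shows how an identifier that outgrows a legacy 32-bit integer column no longer fits the field it was stuffed into.

# A minimal sketch of the "integer field" trap: a legacy system stored account
# numbers in a signed 32-bit integer column, which caps values at 2,147,483,647.
# The account numbers below are made up for illustration.

INT32_MAX = 2**31 - 1  # upper bound of a signed 32-bit integer field

legacy_account_numbers = [
    1_204_556_789,   # fits comfortably in the old field
    2_147_483_000,   # fits, but only just
    4_890_221_337,   # a newer, longer account number; too big for the field
]

def fits_legacy_field(value: int) -> bool:
    """Return True if the value can still be stored in the old 32-bit column."""
    return 0 <= value <= INT32_MAX

for acct in legacy_account_numbers:
    status = "OK" if fits_legacy_field(acct) else "OVERFLOW: truncated or rejected"
    print(f"{acct}: {status}")

Running a check like this against incoming identifiers is one simple way to spot where a legacy work-around is quietly about to break.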

The Good News – The same technological change that is part of the problem also brings tools that make detecting data flaws far more efficient.

More Bad News – The changing nature of business has set information systems up for disaster. The post-millennial penchant for large-scale mergers and acquisitions often forces the dissimilar information systems of multiple companies into a single blended data system. When that data is inaccurate, outdated or rife with duplications and inconsistencies, the potential for creating lethal data bombs is magnified geometrically.

The Good News – The technology for ferreting out bad data is available, and the process of blending multiple data systems provides a perfect opportunity to use it. Cleanup can be costly and time-consuming, but a merger is already expected to cost the company money, and it gives information systems designers a chance to “get it right this time”.
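As one illustration, the sketch below (in Python, using made-up customer records and field names) shows the kind of duplicate-and-inconsistency check that can run while two companies’ customer lists are being blended.

# A minimal sketch: flag records from the two merging systems that share an
# email address but disagree on other fields. All records here are fictional.

from collections import defaultdict

company_a = [
    {"email": "pat@example.com", "name": "Pat Lee", "zip": "30301"},
    {"email": "sam@example.com", "name": "Sam Ortiz", "zip": "60614"},
]
company_b = [
    {"email": "pat@example.com", "name": "Patricia Lee", "zip": "30309"},  # likely the same person
    {"email": "kim@example.com", "name": "Kim Park", "zip": "98101"},
]

# Group every record from both systems by a candidate key (here, a normalized email).
by_email = defaultdict(list)
for record in company_a + company_b:
    by_email[record["email"].strip().lower()].append(record)

# Any key with more than one record is a potential duplicate; any disagreement
# in the remaining fields is an inconsistency that needs human review.
for email, records in by_email.items():
    if len(records) > 1:
        names = {r["name"] for r in records}
        zips = {r["zip"] for r in records}
        print(f"Possible duplicate for {email}: names={names}, zips={zips}")

Grouping on a single candidate key is only a starting point; production matching usually layers on fuzzier comparisons of names and addresses, but even a simple pass like this surfaces problems before they are baked into the blended system.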

Even More Bad News – One reason companies become targets for mergers and acquisitions is a failing organizational structure. As a company’s established procedures break down, data quality processes may no longer be followed carefully, and the ensuing chaos contributes to poor-quality data. It’s not at all uncommon for a company that undergoes a hostile takeover or a rescue takeover to bring with it a truckload of flawed data.

The Good News – Again, a corporate reorganization is a good time to sell the administration on a data quality improvement project. During the spending spree that often follows an ownership change or merger, information systems managers need to think about more than just upgrading their computers. They should seize the opportunity to upgrade their data quality while they are at it.

~