$3.1 trillion – IBM’s 2016 estimate of annual cost related to bad data in the US
75% – estimated share of total costs attributable to data issues
50% – estimated time knowledge workers waste dealing with data issues
60% – estimated time data scientists spend on cleaning and organizing data
Ovum Research estimates that poor data quality costs companies at least 30% of their revenue.
These figures, most of them published by Harvard Business Review (HBR), highlight how inefficiently companies use data to drive decisions. They also reveal a significant opportunity: sound data quality can be a source of competitive advantage.
Reduced productivity and lost customers are just two of the consequences companies suffer from poor data quality. The impact can be direct, hitting the business itself, or indirect, diverting resources to managing bad data.
Let’s elaborate on how poor data quality reduces productivity. As the HBR articles noted, companies spend an inordinate amount of time and resources dealing with data issues. Not only is your team operating at half speed, you must also budget for data consultants or additional data scientists. And it doesn’t stop there. Bad data usually propagates from one department to the next, and, even worse, may reach your customers. Instead of focusing on growing your company, your employees spend half their time toiling away in a “hidden data factory,” correcting each other’s errors.
On the other hand, good data provides clarity into the market and into your own company. Clean data is a competitive advantage: it frees up time to improve your market position and build customer loyalty, and it sharpens your processes and resource allocation. To put it simply, poor data quality means missed opportunities to grow and compete.
Poor data quality is not just a matter of operational inefficiency and direct financial cost. As the hidden data factory concept indicates, if data issues are not identified and remediated effectively, they can harm your company’s reputation with customers, suppliers, and partners. For example, bad customer data can lead to a misunderstanding of a customer’s needs.
Even if data issues never reach customers, poor data quality can brew internal mistrust. Your employees will become disgruntled if they must spend half their time dealing with data issues instead of doing the rewarding parts of their jobs. Poor data quality also translates into a lack of clarity about business performance, and without that clarity, your team will not be on the same page. As bad data spreads from one team to the next, mistrust between departments takes hold where collaboration should be.
The impact of data issues grows exponentially as bad data travels downstream. This concept is captured by the “1-10-100 rule” developed by George Labovitz and Yu Sang Chang: if it costs $1 to prevent a data issue at the entry/submission stage, it costs $10 to remediate it after the fact, and $100 to deal with the failure it causes downstream.
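The cost scaling behind the 1-10-100 rule can be illustrated with a toy calculation. The dollar figures below are the rule’s nominal ratios, not real costs, and the record counts are made up for illustration:

```python
# Illustrative arithmetic for the 1-10-100 rule. The per-issue costs are the
# rule's nominal ratios in dollars; the issue count is a hypothetical example.
STAGE_COST = {"prevent": 1, "remediate": 10, "failure": 100}

def total_cost(n_issues, stage):
    """Cost of handling n data issues, all caught at the given stage."""
    return n_issues * STAGE_COST[stage]

# The same 1,000 issues become 100x more expensive if left to fail downstream.
print(total_cost(1000, "prevent"))    # 1000
print(total_cost(1000, "remediate"))  # 10000
print(total_cost(1000, "failure"))    # 100000
```

The takeaway is that the cheapest place to fix data is always the point of entry.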
It is critical that your company takes data quality seriously. Your teams need to collaborate on establishing rules for identifying data issues. Consider leveraging a data platform that lets you apply these rules automatically across your data sets. Ideally, the platform would also allow you to triage bad data and resolve data issues in a repeatable manner. You will also need to build a data governance framework and train your teams to understand the impact of data throughout its life cycle. The key is that this must be a collaborative effort, supported by a data platform that empowers your teams.
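To make the idea of automated data-quality rules concrete, here is a minimal sketch (not any specific platform’s implementation): each rule is a named predicate applied to every record, and violations are collected for triage rather than silently propagating downstream. The record fields and rule names are hypothetical examples:

```python
# A minimal sketch of automated data-quality rules: rules are plain named
# predicates, so teams can collaborate on adding new checks over time.

def run_rules(records, rules):
    """Return (record_index, rule_name) pairs for every rule violation."""
    violations = []
    for i, record in enumerate(records):
        for name, predicate in rules.items():
            if not predicate(record):
                violations.append((i, name))
    return violations

# Hypothetical customer records with typical entry-stage issues.
customers = [
    {"id": 1, "email": "ann@example.com", "age": 34},
    {"id": 2, "email": "", "age": 29},                  # missing email
    {"id": 3, "email": "bob@example.com", "age": -5},   # impossible age
]

# Example rules; real rule sets would be defined collaboratively by teams.
rules = {
    "email_present": lambda r: bool(r["email"]),
    "age_in_range": lambda r: 0 <= r["age"] <= 130,
}

print(run_rules(customers, rules))
# [(1, 'email_present'), (2, 'age_in_range')]
```

Because the output identifies both the offending record and the violated rule, bad data can be routed to the right team for resolution in a repeatable way.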
Schedule a demo with the BaseCap Analytics team to learn about how to address your data quality issues. A data expert will help design a customized approach to turn your data into a competitive advantage for your company.