Is Poor Data Quality Holding You Back?

October 17 2019

$3.1 trillion - IBM’s 2016 estimate of the annual cost of bad data in the US
75% - estimated share of total costs attributed to data issues
50% - estimated share of their time that knowledge workers waste dealing with data issues
60% - estimated share of their time that data scientists spend cleaning and organizing data

Data Quality Issues

Ovum Research estimates that dealing with poor data quality costs companies at least 30% of their revenue.

These numbers, published by Harvard Business Review (HBR), highlight how inefficiently companies use data to drive decisions. They also reveal a great opportunity: sound data quality can be a source of competitive advantage.

Reduced productivity and loss of customers are just a couple of the consequences of poor data quality that companies suffer from. The impact can be direct to the business or indirect, by diverting resources to manage bad data.

Let’s elaborate on how poor data quality reduces productivity. As the HBR article noted, companies spend an inordinate amount of time and resources dealing with data issues. Not only is your team operating at half speed, you must also budget for data consultants or employ data scientists. But it doesn’t stop there. Bad data usually propagates from one department to the next and, even worse, may reach your customers. Instead of focusing on growing your company, your employees spend half their time toiling away in a “hidden data factory,” correcting each other’s errors.

The Hidden Data Factory

On the other hand, good data provides clarity into the market and into your own company. You can leverage clean data as a competitive advantage: it frees up time to focus on improving your market position and establishing customer loyalty, and it helps you improve your processes and resource allocation. To put it simply, poor data quality leads to missed opportunities to grow and compete.

Poor data quality is not just about operational inefficiencies and direct financial costs. As the hidden data factory concept indicates, data issues that are not identified and remediated effectively can harm your company’s reputation with customers, suppliers, and partners. For example, bad customer data can lead to a misunderstanding of a customer’s needs.

Even if data issues never reach customers, poor data quality can brew internal mistrust. Your employees will become disgruntled if they must spend half their time dealing with data issues instead of doing the fun stuff at work. Poor data quality also translates into a lack of clarity on business performance, and without clarity, your team will not be on the same page. As bad data pollutes one team after another, mistrust between departments takes over, and your organization stops working together.

The impact of data issues grows exponentially as bad data travels downstream. This concept is captured by the “1-10-100 rule” developed by George Labovitz and Yu Sang Chang: if it costs $1 to prevent a data issue at the entry/submission stage, it costs $10 to remediate that issue once it is in your systems, and $100 to deal with the failure it causes downstream. The sketch below makes the arithmetic concrete.

1-10-100 Rule of Data Quality Issues Management
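To see how quickly the costs diverge, here is a minimal sketch in Python. The batch size and the split across stages are hypothetical assumptions chosen for illustration; only the 1:10:100 cost ratio comes from the rule itself.

```python
# A toy cost model of the 1-10-100 rule. The record counts below are
# hypothetical; only the 1:10:100 ratio comes from the rule itself.

PREVENT_COST = 1      # $ to prevent an issue at the entry/submission stage
REMEDIATE_COST = 10   # $ to remediate an issue already in your systems
FAILURE_COST = 100    # $ to deal with the failure it causes downstream

def batch_cost(bad_records: int, prevented: int, remediated: int) -> int:
    """Total cost of a batch of bad records, by the stage each is caught at."""
    failed = bad_records - prevented - remediated
    return (prevented * PREVENT_COST
            + remediated * REMEDIATE_COST
            + failed * FAILURE_COST)

# Catching most issues at entry vs. letting most slip downstream:
print(batch_cost(1000, prevented=800, remediated=150))  # $7,300
print(batch_cost(1000, prevented=50, remediated=150))   # $81,550
```

With the same thousand bad records, shifting the catch point from entry to downstream failure inflates the bill by more than tenfold.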

It is critical that your company take data quality seriously. Your teams need to collaborate on establishing rules for identifying data issues. Consider leveraging a data platform that allows you to apply those rules to your data sets automatically; ideally, the platform would also let you triage bad data and resolve data issues in a repeatable manner. You will also need to build a data governance framework and train your teams to understand the impact of data throughout its life cycle. The key is that this will need to be a collaborative effort and will require a data platform that empowers your teams.
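To make this concrete, here is a minimal sketch of what automated data quality rules can look like, written in Python with pandas. The column names, the three rules, and the flag_violations helper are all hypothetical examples, not the API of any particular platform.

```python
# A minimal sketch of automated data quality rules, assuming customer
# records arrive in a pandas DataFrame. Columns and rules are hypothetical.
import pandas as pd

def flag_violations(df: pd.DataFrame) -> pd.DataFrame:
    """Return the rows that violate at least one data quality rule."""
    issues = pd.DataFrame(index=df.index)
    issues["missing_email"] = df["email"].isna()
    issues["bad_zip"] = ~df["zip_code"].astype(str).str.fullmatch(r"\d{5}")
    issues["negative_balance"] = df["balance"] < 0
    return df[issues.any(axis=1)]  # candidates for the triage queue

customers = pd.DataFrame({
    "email": ["a@example.com", None, "c@example.com"],
    "zip_code": ["10001", "1001", "11215"],
    "balance": [250.0, 75.5, -40.0],
})
print(flag_violations(customers))  # rows 1 and 2 are flagged for triage
```

In practice, rules like these would run automatically as new data arrives, with flagged rows routed to the owning team so issues are resolved in a repeatable way rather than rediscovered downstream.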




Schedule a demo with the BaseCap Analytics team to learn how to address your data quality issues. A data expert will help design a customized approach to turn your data into a competitive advantage for your company.

Giving Back with BaseCap!

August 07 2018

On Thursday, August 2nd, 2018, BaseCap hosted a volunteer event with the Mutual Housing Association of NY (MHANY) to give back to the community. MHANY is a non-profit 501(c)(3) organization that owns and manages hundreds of affordable housing properties for low- and moderate-income families throughout New York.

We started the day off with the team bright and early!

Starting the day

We then took the subway to one of MHANY’s buildings in Brooklyn.

On the subway

Upon arrival, the MHANY team brought us down to the storage room to help organize the building’s files. We cleaned up, moved around shelving and boxes and did a lot of labeling and sorting!

Starting to volunteer

After two hours of hard work, the storage room was beginning to come together…

Progress in the storage room

After assembling some new shelving for the storage room, we took a lunch break in the rear yard of the building. We fueled up on delicious pizza, garlic knots, salad and lots of water along with excellent weather. Everyone was so hungry from working that no one took any pictures of the food!

After lunch, we split into two groups: one group stayed to finish the shelving and cleaning of the storage room, while the other group painted at another MHANY building a few train stops away in Brooklyn.

The shelving and cleaning group made tremendous progress and worked tirelessly to assemble all the shelves, move more boxes and tidy up the storage rooms. We hope our new labeling system will allow the MHANY team to efficiently search for documents that help them manage their buildings!

Shelving and Cleaning: Completed

Meanwhile, the painting group focused on painting the door, railings, windows and the front of the building. People passing by complimented the paint job and said how good the building looks! We had a blast!

Painting the door, railings, windows, and front of the building

After a long day of hard work, the team enjoyed drinks and BBQ to relax from the day’s efforts. Thank you MHANY for having us!

Post-Volunteering BBQ

Changes to the Regulatory Environment: Dodd-Frank Rollbacks

June 05 2018

On May 22nd, 2018, Congress approved the first major Dodd-Frank rollback, relaxing federal oversight for banks with less than $250 billion in assets. The revision reduces the number of banks categorized as systemically important financial institutions (SIFIs) and subject to stricter federal oversight from 38 to 10 or fewer. The change frees many mid-sized banks from tougher capital and liquidity requirements and from conducting stress tests. This gives community and regional banks more flexibility to innovate and grow, and should therefore spur more small business and consumer lending in the United States.

The bill passed with bipartisan support, with the House voting 258-159 to approve the regulatory rollback. Financial institutions had been performing extremely well prior to the rollback: a report from the Federal Deposit Insurance Corporation (FDIC) states that the combined net income of American commercial banks and savings institutions reached $56 billion in the first quarter of 2018, a 27.5% increase year over year.

In addition, the Federal Reserve announced on May 30th, 2018 a proposal to revise the Volcker Rule, which bans proprietary trading and prohibits banks from taking large stakes in hedge funds and private equity firms. The Federal Reserve, the SEC, the FDIC, the OCC, and the CFTC are in the process of approving the proposal to simplify the Volcker Rule so that banks have an easier time knowing how to comply and so that the rule can be adequately enforced. Proprietary trading remains illegal, but under the biggest proposed change, banks will no longer have to specifically prove that each of their trades hedges a specific risk and is not just a speculative bet. The proposal also applies different compliance standards based on the size of a bank’s trading assets: over $10 billion is deemed “significant,” between $1 billion and $10 billion “moderate,” and under $1 billion “limited.”
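Purely as an illustration, the proposed tiering can be expressed as a simple threshold function. The dollar thresholds come from the proposal as described above; the volcker_tier name and the treatment of the exact $1 billion and $10 billion boundaries are assumptions.

```python
# A minimal sketch mapping a bank's trading assets to the proposed
# Volcker compliance tiers. Thresholds are from the proposal as described
# above; the handling of the exact boundary values is an assumption.

def volcker_tier(trading_assets_usd: float) -> str:
    """Classify trading assets (in US dollars) into a compliance tier."""
    if trading_assets_usd > 10e9:
        return "significant"
    if trading_assets_usd >= 1e9:
        return "moderate"
    return "limited"

print(volcker_tier(25e9))    # significant
print(volcker_tier(3e9))     # moderate
print(volcker_tier(400e6))   # limited
```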

However, regulators are indicating that they will require big banks to self-regulate by creating new internal controls. Some have proposed requiring bank CEOs to personally attest that their institutions are adhering to the restriction on speculative betting. Self-regulation will present a different set of challenges for banks because processes and controls will be less standardized across the industry.

BaseCap Analytics and other regtech startups will play a central role in guiding financial institutions toward self-regulation. Our expertise in creating a balanced approach allows us to guide and support risk committees and internal audit teams as they build frameworks for risk tolerance and risk limits, ensuring adequate controls throughout the organization.