News & Insights

Company Updates

Avoid Data Errors Amidst Record Breaking Origination Volume


Data technology sets the stage for organizations to identify issues before they become more costly down the line.

Recently published data shows the largest refinance volume in history: over $1.1 trillion in first-lien mortgage refinances in Q2 2020. What’s more, trends indicate that Q3 2020 purchase lending will rise by 30-40%, which would push volume to another record high. With servicer and investor transfers turning over at high speed, the data being boarded onto your systems barely has time for a true QC before it is released to another entity. Similarly, new loans coming onto your books will have barely boarded their initial servicing system before RESPA notifications fly out the door and the loans land on yours. How accurate can the data be? How many loans were actually reviewed? One in ten? Five in a hundred? There are errors in your data. But how can you find them? Which ones must be corrected right now?

Several years ago, large mortgage banks acquired on the order of sixty thousand loans each month. Before they had even been boarded to the servicing system, the loans had been sold to another investor, to be interim serviced for 40 days. QC departments consisted of ten to twenty people. Their approach was to review a sampling of loans – say ten – per deal (each pool held roughly six thousand loans). If they found an error, they would look at another ten loans. Even when errors were identified, there was little recourse, as the loans were already sold. Or, if the loans were found to be wanting, they were pulled from the pools, bundled as “scratch and dent,” and sold for pennies on the dollar just to get them out of the portfolio.

How does your firm compare?

How many loans get a QC that ensures every data point boarded is accurate? How many loans have boarded this year alone with incorrect ARM information or faulty origination elements?

It’s no secret that bad data leaves organizations unable to trust their own decisions. Worse, it forces firms into repeated loan repurchases, making it difficult to represent the quality of their portfolios to clients and investors.

At BaseCap, we can put a data validation platform in place on day one, with rule validations that determine the quality of what’s coming in the door. What’s more, you have the freedom to create any set of rules to determine whether the data in each field stands up to the level of scrutiny you require. From one field to thousands of data sets, the BaseCap platform has the flexibility to review as much data as needed every day, delivering scorecards and cleaned data to whatever application or database you require.
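To make the idea of field-level rule validations concrete, here is a minimal sketch of how such checks might work in principle. This is an illustrative assumption, not BaseCap’s actual platform or API: the rule names, fields (`loan_id`, `original_balance`, `arm_margin`), and the scorecard format are all hypothetical.

```python
# Hypothetical sketch of rule-based field validation for boarded loan data.
# The fields, rules, and thresholds below are illustrative assumptions only.

def not_null(value):
    """Field must be present and non-empty."""
    return value is not None and value != ""

def positive_amount(value):
    """Field must parse as a number greater than zero."""
    try:
        return float(value) > 0
    except (TypeError, ValueError):
        return False

def plausible_arm_margin(value):
    """ARM margin must fall in a plausible range (assumed 0-10%)."""
    try:
        return 0 < float(value) < 10
    except (TypeError, ValueError):
        return False

# Each field maps to the list of rules it must satisfy.
RULES = {
    "loan_id": [not_null],
    "original_balance": [not_null, positive_amount],
    "arm_margin": [plausible_arm_margin],
}

def score_loan(loan):
    """Return per-field pass/fail results for one loan record."""
    return {
        field: all(rule(loan.get(field)) for rule in rules)
        for field, rules in RULES.items()
    }

def scorecard(loans):
    """Aggregate per-field pass rates across a batch of boarded loans."""
    totals = {field: 0 for field in RULES}
    for loan in loans:
        for field, passed in score_loan(loan).items():
            totals[field] += passed
    n = len(loans)
    return {field: passes / n for field, passes in totals.items()}
```

Running `scorecard` over a batch of incoming loans yields the fraction of records that pass each rule, so problem fields (an empty loan ID, a negative balance, an implausible ARM margin) surface before the loans are released downstream.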

You’re getting a tremendous amount of data with each round of new loans, and you understandably have questions about its quality. Let us help ensure the accuracy of the lifeblood of your business – your data. Contact us today!