News & Insights

Solving Loan Professionals’ Data Challenges with Checklist Automation


May 18, 2023

Craig Riddell, Executive Director of Sales and Account Management at BaseCap Analytics, was interviewed about the data challenges facing the mortgage industry, and the solution that will help loan professionals accomplish more in less time.

Tell me about the typical process that mortgage banks use to validate their loan data.

There are two worlds in that answer. In one, there’s a process checklist. As files work their way through, there are workflow statuses that are checkpoints. Some are embedded in tools like a DMS or an OS system. And those are used from origination to processing, underwriting, closing, funding, and onward through servicing.

And then you’ve got the oversight checklists for regulatory compliance and quality control. You may have state and federal agencies depending on the type of files you’re originating. Some of that may go away with more and more automation because there’s some certainty that can be embedded into the process. I don’t think we’re there yet, but it’s getting better.

The oversight scrutiny will never go away, which is good, right? It’s necessary. Now, it’s just a matter of: how will the industry continue to best deliver accurate feedback to those agencies that are there to enforce the performance?

There’s an old joke that there’s just one massive checklist that everybody’s been following for years. And it just gets bigger and bigger, and nobody ever takes anything off that checklist. Now, people use a mouse instead of a pencil, but the work itself still has to be evaluated.

You mentioned that uncertainty is constant, and regulatory oversight has always been a constant for these banks. What other pain points are they experiencing with the typical checklist process?

Regardless of whether it’s oversight, a workflow, or a process, the speed at which they can modify that behavior is always a concern. How many people in your organization have the authority to make those changes? If it’s not self-service and it’s through a vendor, how quickly can they get something done and tested and back into production?

The nuance in these tools is another critical area. There used to be just a couple of loan products. You now have many, many loan products. Fewer than there were before the economic crash, but more products, and more nuance in how consumers generate their income and document it. So, all of that needs to be accounted for. And those changes, which are ideally made in real-time, are a pressure point.

Additionally, there’s this challenge to run checkpoints, but they’re always at a single point in time, right? It’s a milestone record. But the moment after that checklist is evaluated or that checkpoint is completed, something could go sideways. And it won’t be recognized for hours, days, weeks, or even longer because certain evaluations are run incrementally. That’s a pain point in the industry because it requires constant observation.

Although there’s press out there that says tools exist for that, there are very few of them, and they have some obstacles to their deployment. 
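The distinction Craig draws between a one-time checkpoint and constant observation can be illustrated with a minimal sketch. This is a hypothetical example, not BaseCap’s implementation: the rule names, fields, and thresholds are invented. The idea is simply that validation rules re-run on every change to a loan record, so a post-checkpoint edit is flagged immediately instead of waiting for the next scheduled review.

```python
from dataclasses import dataclass, field

@dataclass
class LoanRecord:
    """A loan file whose validation rules re-run on every update."""
    data: dict
    rules: list                      # list of (name, predicate) pairs
    findings: list = field(default_factory=list)

    def update(self, **changes):
        # Apply the changes, then immediately re-evaluate every rule,
        # rather than only at a milestone checkpoint.
        self.data.update(changes)
        self.findings = [name for name, rule in self.rules
                         if not rule(self.data)]

# Hypothetical rules for illustration only.
rules = [
    ("income_documented", lambda d: d.get("income_doc") is not None),
    ("ltv_within_limit",  lambda d: d.get("loan_amount", 0)
                                    <= 0.8 * d.get("appraisal", 0)),
]

loan = LoanRecord(
    {"income_doc": "W-2", "loan_amount": 400_000, "appraisal": 520_000},
    rules,
)
loan.update()                        # passes both rules at the checkpoint
assert loan.findings == []

# Something "goes sideways" after the checkpoint: a revised appraisal.
loan.update(appraisal=450_000)
assert loan.findings == ["ltv_within_limit"]   # caught immediately
```

With an incremental, batch-style evaluation, that appraisal change would sit unnoticed until the next scheduled run; re-evaluating on every update is the difference between a milestone record and continuous observation.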

“The moment after that checklist is evaluated, or that checkpoint is completed, something could go sideways, and it won’t be recognized for hours, days, weeks, or even longer because certain evaluations are run incrementally.”

Craig Riddell

It’s clear that based on these pain points, banks need to modernize their process—to go faster and be able to validate more data. What are some of the market forces that are driving that? What’s going to push them to go forward with that digital transformation?

At the end of the day, it’s almost always the customer, right? The consumer and their access to information and their desire for real-time results. There’s a world out there that expects things to happen with speed, efficiency, accuracy, and trustworthiness. And that’s delivered in many cases.

But that’s a hard ask for a large real estate transaction. So, that reality of all the things that need to be done properly but done quickly and with transparency to the consumer is ultimately what everybody’s trying to accomplish.

That scrutiny gets deeper and deeper as more data is available and more data is captured. So that’s another force: competition. Technology pushing itself upon the industry is a good thing, but it creates challenges. It’s intended to create greater efficiency across the board and reduce costs, but it creates projects. Not all technology plays nice with other technologies, even though it’s an API world, and you would think that best-of-breed applications will win out. They’re just not all accepted by some of the bigger competitors in the market, which can stall adoption. And that has an impact on everyone.

Ultimately, those would be the drivers: the consumer, the customer, the regulators, and then technology (in a good way) applying pressure. But some cracks develop when you apply pressure too fast.

Another challenge facing the mortgage industry is the increased cost of servicing loans. There’s recent evidence that the average cost has skyrocketed to more than $11,000. What do you think is driving that particular number? Why are loans getting more expensive to service?

I don’t want to say it’s comical, but the price increasing over and over again over the last ten years is a bit of a head-scratcher to many, particularly outside the industry. It’s just a little bit outrageous.

I think some of what we’ve talked about so far has to do with this desire to accommodate all the new technology as it comes along. If you want to always be on the cutting edge, that cost comes with it. Integration and customization expenses have to get paid for somewhere. So, while you’re chasing efficiency and automation on one side, you burden yourself with increased expenses elsewhere. That cost may be found in consulting, or it could be a foundational expense to adequately oversee all the technology that’s now a part of the process.

The pandemic brought to light many gaps in the process. There’s so much data that is not properly recorded and accounted for in some of the prevailing systems that handle the majority of loans, right? I think every time we think we’ve solved for X, we find that there’s a gap somewhere else. So, all of that still has higher-priced resources trying to solve some of these problems.

What are some of the unique ways the BaseCap solution addresses the pain points of the increasing cost to service loans and the technical complexity of handling all the different steps?

One of the real wins for BaseCap is that you can look at all data in all loans all the time.

It is total observation: total auditability of what’s been found. The findings are extremely nuanced and articulate, and they can be routed to the appropriate subject matter expert in your organization, who can resolve them and determine whether it’s truly an error, just a typographical mistake, or something that is a showstopper.

“One of the real wins for BaseCap is you can look at all data in all loans all the time.”

Craig Riddell

Compare it to the sampling methodology that the industry has always used. Number one, it’s what was required by the regulators. But because of the nuance, sampling is becoming unreliable. There are too many discrepancies that are possible across the portfolio, and you’re only looking at 10% of it in any given examination.

Now that there are tools to look at 100% of the data, all the time, from application through to payoff, that’s really where I think BaseCap has built a better mousetrap, and it’s a game changer.
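The gap between sampling and full-population review is easy to demonstrate. The sketch below is a simplified, hypothetical illustration (the field names and defect counts are invented, and it is not BaseCap’s method): a 10% sample can easily miss defective files, while checking every record finds every defect by construction.

```python
import random

random.seed(7)  # fixed seed so the illustration is reproducible

# A portfolio of 1,000 loan files, 25 of which have a documentation defect.
portfolio = [{"id": i, "apr_documented": True} for i in range(1000)]
for i in random.sample(range(1000), 25):
    portfolio[i]["apr_documented"] = False

# Sampling approach: examine a 10% sample, as traditional QC does.
sample = random.sample(portfolio, 100)
sampled_defects = sum(not loan["apr_documented"] for loan in sample)

# Full-population approach: evaluate every file, every time.
all_defects = [loan["id"] for loan in portfolio
               if not loan["apr_documented"]]

print(f"10% sample surfaced {sampled_defects} defects; "
      f"full review surfaced {len(all_defects)}.")
assert len(all_defects) == 25    # the full check always finds all 25
```

On average the 10% sample surfaces only a couple of the 25 defects, and which ones it surfaces varies run to run; the full-population pass is deterministic, which is the point of evaluating all data in all loans all the time.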

To learn more about BaseCap’s solution to expensive and time-consuming data quality control processes, visit Or contact Craig at