
Case Study: Obtaining Data from Multiple Servicers – SBO.NET Integration

Mortgage lending firms have been experiencing rapid growth, and many also rely on numerous mortgage servicers to provide data on legacy loan populations. Whether a firm is growing or simply maintaining its portfolio, keeping that growth from being hampered by data issues has become a significant concern.

Growing Pains

With only a handful of servicers and a few disparate data schemas, firms could get by with Excel to generate daily reports. Managing the data pipeline did not take up a significant part of any team’s capacity, and some IT teams had enough domain knowledge to normalize the small volumes of data so that analysts could generate reports with straightforward queries.

As the number of servicers grew, however, normalizing the incoming data feeds became more onerous. Servicers refused to adjust their reporting to match requirements and sent files in frequently changing layouts. Without normalization, analysts had to build increasingly complicated queries and conversion workbooks for each additional data source, and every month brought a fresh round of tension as servicers sent new files that had to be re-interpreted into the existing processes.
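To make that maintenance burden concrete, the sketch below shows the kind of per-servicer mapping that has to be kept up to date by hand. It is a minimal illustration, not the firm’s or BaseCap’s actual tooling; the servicer names, column names, and canonical schema are assumptions.

```python
# Illustrative only: hypothetical servicer layouts mapped into one canonical schema.
import pandas as pd

# Assumed canonical columns the daily reports are written against
CANONICAL_COLUMNS = ["loan_id", "upb", "next_due_date", "delinquency_status"]

# One hand-maintained mapping per servicer layout (names are made up)
SERVICER_COLUMN_MAPS = {
    "servicer_a": {
        "LoanNumber": "loan_id",
        "UPB": "upb",
        "NextDueDt": "next_due_date",
        "DelinqCode": "delinquency_status",
    },
    "servicer_b": {
        "acct_no": "loan_id",
        "unpaid_principal_balance": "upb",
        "next_payment_date": "next_due_date",
        "dq_status": "delinquency_status",
    },
}

def normalize(path: str, servicer: str) -> pd.DataFrame:
    """Load one servicer file and reshape it into the canonical schema."""
    df = pd.read_csv(path)
    df = df.rename(columns=SERVICER_COLUMN_MAPS[servicer])
    df["next_due_date"] = pd.to_datetime(df["next_due_date"], errors="coerce")
    return df[CANONICAL_COLUMNS]
```

Every time a servicer changes its layout, a mapping like this – or its Excel and query equivalents – has to be reworked, which is exactly the recurring effort described above.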

This is not a sustainable approach. What had been a small part of staff workload quickly grew to require full-time support. Firms could assign business analysts to address the data issues, but that would pull them away from their main function – analytics. They could hire specialized data engineers instead, but for most mortgage firms the cost was prohibitive and counter-productive.

[Image] Before: a large number of servicers; no data quality control; manual processes requiring staff at various stages

A Real-World Scenario: Understanding the Problem and Strategizing a Solution

A prominent investment firm had been growing its mortgage business and consulted with BaseCap Analytics to address its growing data issues. Its portfolio had reached 7000+ loans, each with 600+ data points (columns), representing roughly $1.66B in loan volume.

Analysts at the investment firm reported that their work could not scale with the ever-increasing number of servicers. Each additional data source shifted their time and energy toward data management – normalizing new feeds, cleaning and validating the data, and revising queries to include new servicer data in daily reports – and away from value-added functions such as analyzing data, performing liquidations, and servicing high-risk accounts.

Over time, the entire process became unwieldy – the data ingestion process required ongoing management, reporting queries grew more convoluted, and internal knowledge transfer became excessively difficult as new hires tried to orient themselves to a complex pipeline running from ingestion to actual analysis. Fixing the situation only grew more daunting – redesigning the data pipeline was like fixing the engine on a moving train.

These challenges were nothing new to BaseCap Analytics; the team had worked with other large financial institutions to evolve legacy data models into something more sustainable and efficient. Problems like these typically build up over years of addressing data issues incrementally and in a siloed fashion.

BaseCap Analytics has extensive experience developing, managing, and correcting data pipelines that let any number of sources populate client databases accurately and on time. Meeting this client’s needs required combining that experience with creative problem-solving.

BaseCap Analytics strategized several changes to improve the firm’s data pipeline:

  1. The client had more than twenty unique servicers reporting and remitting in different formats and with varying levels of data quality. The client needed not only to ingest data from each servicer, but also to ensure the data was standardized and of high quality without staff having to review files and correspond with the servicers. BaseCap Analytics’ Data Quality Platform was deployed to manage every available data source; it scales to one or one thousand data sources immediately after setup and delivers clean, standardized data shortly after receipt.

  2. Early on, the client determined it needed to reduce the number of servicers. BaseCap Analytics and the Data Quality Platform managed data accuracy and loan transfers as the client moved loans between servicers, leaving the client with two primary servicers (now ~97% of the portfolio) and extremely clean, standardized reporting and data from each.

  3. While the transfers were underway, the client evaluated its servicer-oversight vendor and determined that the spend was not in line with the returns. It again turned to BaseCap Analytics to identify the best options for replacing the vendor and, after evaluating multiple candidates, selected SBO.NET and engaged BaseCap Analytics to establish the data setups and ongoing data feeds.

  4. After working closely with the client and the vendor to define requirements, BaseCap Analytics built a variety of purpose-built pipelines and used the ETL engine to automatically deliver SBO.NET-specific, ingestible files on daily, weekly, and monthly schedules (a simplified sketch of this kind of validated, scheduled export follows this list).

  5. All of this ran through the Data Quality Platform to ensure consistent formatting, data quality, and timely loading, freeing staff from data quality reviews. Instead, they could focus on value-added functions, addressing source and servicing issues and collaborating with the IT department within the same Platform environment.
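To ground the automated steps described in items 1, 4, and 5, here is a minimal, hypothetical sketch of rule-based quality checks followed by a dated, standardized export for a downstream system such as SBO.NET. The rule names, column names, and file naming are illustrative assumptions, not the Data Quality Platform’s actual implementation.

```python
# Illustrative only: rule-based quality checks, then export of validated rows
# to a dated, standardized extract for a downstream system such as SBO.NET.
from datetime import date
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> pd.DataFrame:
    """Flag rows that fail basic completeness and range rules (assumed rules)."""
    failures = pd.DataFrame({
        "missing_loan_id": df["loan_id"].isna(),
        "negative_upb": df["upb"] < 0,
        "missing_due_date": df["next_due_date"].isna(),
    })
    out = df.copy()
    out["passed_quality_checks"] = ~failures.any(axis=1)
    return out

def export_daily_file(df: pd.DataFrame, out_dir: str) -> str:
    """Write only rows that passed the checks to a dated extract file."""
    clean = df[df["passed_quality_checks"]].drop(columns=["passed_quality_checks"])
    out_path = f"{out_dir}/servicing_extract_{date.today():%Y%m%d}.csv"
    clean.to_csv(out_path, index=False)
    return out_path
```

In practice, a scheduler would drive these steps on the daily, weekly, and monthly cadences mentioned above and route exceptions back to staff for follow-up rather than blocking the feed.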

Ready to Scale

Working with BaseCap gave the investment firm a data pipeline that scales with its roadmap, with a clear path to grow its portfolio beyond the 7000+ loans it was already managing.

[Image] After: consolidated servicers; automated data pipeline; end-to-end data quality control

This process revision allowed the client’s analysts to focus on growth instead of being distracted by data quality issues. With margin compression and a shrinking talent pool in the mortgage space, it was strategically important that the client find a way to sustainably increase its loan volume while keeping its talent focused on their core competencies.

How can BaseCap Analytics help?

  • If you are looking to expand your mortgage portfolio, you need a sustainable data pipeline.

  • BaseCap Analytics can help you grow your business without additional IT staff and hardware.

  • Schedule a demo with BaseCap Analytics, and a data expert will help design a customized approach to turn your data into a competitive advantage for your company.
