The Case For Scientific Servicing

REQUIRED READING: When you are making business decisions worth millions of dollars, you want the best information you can get. But it is not just the volume of information that is important – the source of the information and the manner in which it is used have become equally critical.

In today's mortgage business, modeling and automated decision-making – from automated underwriting to optimal-outcome default decisioning – have become growing trends within the industry, and within the servicing sector in particular. We are truly entering a new era of mortgage science, as the art of analytics grows ever more sophisticated, driven by necessity, demand and the nature of the risks involved across the mortgage continuum.

From government-ruled delineation to repurchase decisioning, all operations in the mortgage space today must utilize modeling to succeed. Whether we are talking about a single loan or an entire portfolio, there is too much riding on the ability to make smart decisions to neglect using the best science available. This is particularly true for an industry that continues to face increasing regulations and media risk.

When one considers the sheer volume of information and the diversity of sources that mortgage servicers must contend with, a scientific approach to mortgage servicing becomes a no-brainer. The tentacles of the mortgage process reach everywhere – including vendors, regulators, borrowers, attorneys, real estate agents and many others. The constant exchange of information and documents among these parties generates an enormous amount of data that would be impossible to understand without sophisticated modeling in place.

The real issue for mortgage servicers is to get into a ‘big data’ frame of mind. Big data refers to the collection and handling of data sets so large that they cannot be understood – let alone managed – using traditional database management solutions. The sheer size of the data becomes a limitation when capturing, managing and analyzing it, so new solutions and techniques must be created to compress the process into a shorter time frame and into a format that enables easier decision-making.

The mortgage industry is just starting to become aware of the ‘big data’ challenge; by and large, most lenders and servicers are not addressing it. Going forward, this is bound to change.

While data by itself is of no use at all, it becomes invaluable when properly utilized to maximize profits and mitigate risk. Indeed, one can gauge the effectiveness and success of any operation or company by its ability to collect, analyze and utilize data to manage its business.

Reporting is only one component when it comes to effectively using data. The other major component is using the right data variables for modeling throughout the business process.

The art of data collection is a key component of proper modeling. The richer the data inside the model, the sharper the intelligence it provides. Creating such a model requires selecting the right data variables to build around.

Yet, selecting the critical data points that will drive decisions and actions is easier said than done. In order to do this right, companies need to rely on a blend of analytical, business and technological experience.

Origination loan-to-value, cash down at closing and payment history, combined with market data such as saturation, the house price index (HPI), the current combined loan-to-value ratio and property condition, are excellent examples of the data points that will drive proper decisioning and outcomes. But that is not all: when done properly, a borrower-specific questionnaire can also help determine the likelihood of a refinance, how long a borrower will keep paying on a loan modification, or cooperation in a short sale. Valuable, hard data can be culled from borrower questionnaires; used in conjunction with other loan data points, it adds to the effectiveness of modeling.
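As an illustration only – the field names and weights below are hypothetical, not any vendor's actual model – these data points might be assembled into a single loan record and scored along the following lines:

```python
# Hypothetical sketch: the loan-level and market data points named
# above gathered into one record, plus a toy score. All field names
# and weights are illustrative, not a production model.
from dataclasses import dataclass

@dataclass
class LoanRecord:
    origination_ltv: float      # loan-to-value at origination
    cash_down_pct: float        # cash down at closing, as % of price
    missed_payments_12m: int    # payment history summary
    market_saturation: float    # distressed share of the local market
    hpi_change_1y: float        # one-year house price index change
    current_cltv: float         # current combined loan-to-value
    property_condition: int     # 1 (poor) through 5 (excellent)

def refinance_likelihood(loan: LoanRecord) -> float:
    """Toy score: equity and a clean payment history raise the odds."""
    score = 0.5
    score += 0.2 * max(0.0, 1.0 - loan.current_cltv)  # equity helps
    score -= 0.1 * loan.missed_payments_12m           # delinquency hurts
    score += 0.1 * loan.hpi_change_1y                 # rising prices help
    return min(max(score, 0.0), 1.0)
```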

Pick list questions

There is still more. In addition to the hard data points derived from borrower questionnaires, there is often additional information about the borrower that is not yet known and does not exist in numerical form. That is why it is important to incorporate ‘pick list’ questions for the borrower in the modeling process.

Pick list questions – which allow users to pick their answers from a preset list of choices or values – are used to predict borrower behavior and are designed specifically to avoid free-form responses, which have no consistency and are impossible to map. When it comes to borrowers' behavior and attachment to their homes, it comes down to asking the right questions and properly training the staff on how to ask them. Drawing on companies that help select juries, or mining recordings of loss mitigation calls, can add to the science of creating borrower data and ensuring a proper model.
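A minimal sketch of how a pick list question might be structured so that every answer maps to a consistent numeric code (the question, choices and codes here are invented for illustration):

```python
# Hypothetical pick list question: answers come from a preset list,
# so each response maps to a stable numeric code; free-form text is
# deliberately impossible.
ATTACHMENT_QUESTION = {
    "prompt": "How important is staying in this home to you?",
    "choices": {
        1: "Not important - I am ready to move",
        2: "Somewhat important",
        3: "Very important - I want to keep the home",
    },
}

def record_answer(question: dict, choice: int) -> int:
    """Reject anything outside the preset list of choices."""
    if choice not in question["choices"]:
        raise ValueError(f"Answer must be one of {sorted(question['choices'])}")
    return choice  # the numeric code feeds straight into the model
```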

A prime example of utilizing modeling to maximize profits and mitigate risk is loan segmentation modeling. This form of analysis has been underused in the default mortgage space, yet it can be extremely effective.

The process starts on the 60th day of delinquency, when the key loan, borrower and property data elements are mapped into a decisioning engine. The decisioning engine is also equipped with government, investor and state rules to ensure compliance on all loans in a consistent manner. The loans are run through the rules engine, which factors in items such as borrower willingness, attachment and ability to pay, as well as other variables such as granular credit trends, property HPI and market saturation.
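A rough sketch of how such a compliance layer might work – each rule inspects a loan record and either passes or reports a violation. The rules shown are simplified stand-ins, not actual regulatory logic:

```python
# Hypothetical compliance layer: every loan is checked against each
# rule; a rule returns None on pass or a reason string on failure.
from typing import Callable, Optional

Rule = Callable[[dict], Optional[str]]

def no_dual_tracking(loan: dict) -> Optional[str]:
    # Simplified stand-in for anti-dual-tracking requirements.
    if loan.get("modification_in_review") and loan.get("foreclosure_started"):
        return "dual tracking: foreclosure begun during mod review"
    return None

def investor_allows_modification(loan: dict) -> Optional[str]:
    # Simplified stand-in for investor guidelines.
    if loan.get("proposed_outcome") == "modification" and not loan.get("investor_allows_mod", True):
        return "investor guidelines prohibit modification"
    return None

COMPLIANCE_RULES: list[Rule] = [no_dual_tracking, investor_allows_modification]

def check_compliance(loan: dict) -> list[str]:
    """Run every government/investor/state rule; collect violations."""
    return [msg for rule in COMPLIANCE_RULES if (msg := rule(loan))]
```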

The output of the model flows into a distribution chain, which utilizes rules-based intelligence to route the loan toward the optimal outcome. The file is then delivered to the proper person or outsourced company based on skill set, geography, loan size and severity. Depending on the optimal outcome selected, the loan could be routed to refinance, loan modification, foreclosure, short sale, deed-in-lieu or special handling teams.
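In code, such routing might look like the following sketch; the queue names mirror the outcomes listed above, while the thresholds and flags are invented for illustration:

```python
# Hypothetical routing step: the recommended outcome plus a few loan
# attributes select a team/queue. Thresholds and names are made up.
def route_loan(loan: dict, recommended_outcome: str) -> str:
    queues = {
        "refinance": "refinance_team",
        "modification": "loan_mod_team",
        "short_sale": "short_sale_team",
        "deed_in_lieu": "deed_in_lieu_team",
        "foreclosure": "foreclosure_team",
    }
    # High-severity or high-balance files go to special handling.
    if loan["unpaid_balance"] > 1_000_000 or loan.get("litigation_flag"):
        return "special_handling_team"
    team = queues.get(recommended_outcome, "special_handling_team")
    # Append the property's state to reach the right geographic desk.
    return f"{team}:{loan['property_state']}"
```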

By conducting the process on the 60th day, mortgagees are able to reconcile their entire delinquent loan portfolios and ensure consistent treatment of all loans across staff and outsource vendors. This is particularly important for companies that must demonstrate compliance with a growing number of new regulations, such as the anti-dual-tracking rules emerging from both the national servicing settlement and individual state regulations. In addition to modeling and distribution, technology can provide a snapshot of all decisions, data and documents to establish a complete audit trail if needed in the future.
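What an audit-trail entry might look like, as a minimal append-only sketch with hypothetical field names:

```python
# Hypothetical audit trail: entries are only ever appended, so the
# full decision history can be replayed for an examiner later.
import datetime
import json

AUDIT_LOG: list[dict] = []  # stand-in for durable storage

def log_decision(loan_id: str, decision: str, inputs: dict) -> None:
    AUDIT_LOG.append({
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "loan_id": loan_id,
        "decision": decision,
        # Snapshot of the data the decision was based on.
        "inputs_snapshot": json.dumps(inputs, sort_keys=True),
    })
```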

To ensure a single point of contact, the loss mitigator is tagged in the distribution chain to make sure he or she continues the relationship with the borrower. When final outcomes are completed, the data is used to further refine the models and make them even more predictive.
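That feedback loop could be as simple as the following sketch, with invented names and an arbitrary retraining threshold:

```python
# Hypothetical feedback loop: each completed outcome becomes a
# labeled training example; the model is refit once enough new
# labels accumulate. Threshold and structure are illustrative.
TRAINING_ROWS: list[dict] = []
RETRAIN_THRESHOLD = 500  # refit after this many new outcomes

def record_final_outcome(loan_features: dict, final_outcome: str) -> bool:
    """Store the labeled example; return True when a refit is due."""
    TRAINING_ROWS.append({**loan_features, "label": final_outcome})
    return len(TRAINING_ROWS) % RETRAIN_THRESHOLD == 0
```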

As the industry faces more rules and new risks, the need for additional science and ‘big data’ solutions in the mortgage industry grows even more pressing. Just as automated underwriting transformed the origination process, the innovations in modeling happening in the servicing business today will have a lasting impact for years to come. As margins tighten and the quest for compliance becomes even more prevalent, the cost of not having such tools becomes too large to ignore.

John Vella is chief operating officer of Los Angeles-based Equator LLC. He can be reached at john.vella@equator.com.
