Taking A New Approach To RMBS Valuation


REQUIRED READING: Following the global financial crisis, residential mortgage-backed securities (RMBS) market participants have paid more attention to developing better processes to evaluate how these securities will perform over time. New tools are allowing a more sophisticated, in-depth view of the underlying collateral that backs these securities.

At the same time, the market has seen significant advances in the models used to predict mortgage behavior – including the incorporation of updated payment behaviors, enhanced housing-price indices and macroeconomic data. These functions have been fully integrated, providing a complete workflow that generates more accurate valuations on individual bonds than at any time in the past.

Data and analytics firms are seeking to produce valuations and in-depth credit analysis on much larger swathes of the market, in a faster time frame. Their efforts focus on four key areas: aggregating and analyzing collateral data; generating cashflow vectors; integrating predictive models into capital structures to produce bond-level cashflows; and arriving at a price by utilizing market color and relevant discount rates.

The first key area involves implementing a database that stores information on collateral, bonds and deals. This data is used to drive downstream models that predict loan-level cashflows. Individual payment streams are translated into aggregated trust payments that are used to estimate future principal-and-interest distributions to bond holders.

The workflow should begin with three basic categories of information: attributes describing the trusts; information outlining the mortgage characteristics at trust issuance; and periodic monthly cashflows, including scheduled and unscheduled mortgage payments. This is an area of vast improvement in recent years, as better data-aggregation processes have dramatically raised the quality of loan-level data.
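
As a rough illustration of how those three categories might be organized, the sketch below lays them out as record types. The field names are hypothetical; an actual collateral database would carry far more attributes per trust, loan and remittance period.

```python
# A minimal sketch of the three data categories, using hypothetical field names.
from dataclasses import dataclass
from datetime import date

@dataclass
class TrustAttributes:              # attributes describing the trust
    trust_id: str
    shelf: str
    closing_date: date
    original_balance: float

@dataclass
class LoanAtIssuance:               # mortgage characteristics at trust issuance
    loan_id: str
    trust_id: str
    original_balance: float
    note_rate: float                # annual coupon, e.g. 0.065
    fico: int
    ltv: float
    doc_type: str

@dataclass
class MonthlyLoanCashflow:          # periodic scheduled and unscheduled payments
    loan_id: str
    period: date
    scheduled_principal: float
    unscheduled_principal: float    # prepayments, including full payoffs
    interest: float
    delinquency_status: str         # e.g. "current", "30", "60", "90+", "REO"
```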

That data also is being delivered faster. Investors have more options for comprehensive loan-level data sets than they did prior to the 2008 financial crisis. There are now data sets that include comprehensive data on loss severities, loan modifications and other ancillary delinquency attributes covering a broad spectrum of the nonconforming RMBS market. Collateral data sets also are fully integrated with outside data sets, including housing prices and credit scores, allowing richer analysis.

Within the RMBS valuation workflow, there are three primary uses of the loan-level information: to display information on historic loan behavior, allowing insight into individual bond transactions; to categorize historic loan payment behaviors, driving predictions on future cashflows; and to provide a feedback loop between the retrospective and prospective views of periodic loan payments.

To make the best use of the massive amounts of loan information, database and statistical tools are used to stratify similar loan types and correlate individual attributes with specific historic loan performance to derive predictions on future payment behaviors. Front-end tools produce graphic representations of projected cashflows to display loan-level performance. Predictive models are then used to create a time series containing performance projections that outline the specific timing of future cashflows and/or losses. Full time-series views of these estimates are generally referred to as collateral cashflow vectors.
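
The sketch below illustrates the idea in miniature: loans are bucketed by hypothetical FICO and LTV cut points, historic unscheduled payments are turned into a single-month mortality (SMM) per bucket, and that rate is annualized into a flat CPR vector. Real stratifications and vector shapes are far richer; this is only a sketch under those assumptions.

```python
# Simplified stratification: observed prepayment behavior -> forward CPR vector.
from collections import defaultdict

def stratum(loan):
    """Assign a loan to a coarse FICO x LTV bucket (hypothetical cut points)."""
    fico_bucket = "fico<660" if loan["fico"] < 660 else "fico>=660"
    ltv_bucket = "ltv<=80" if loan["ltv"] <= 80 else "ltv>80"
    return (fico_bucket, ltv_bucket)

def historical_smm(loans, history):
    """Average single-month mortality per stratum from observed prepayments."""
    prepaid = defaultdict(float)
    balance = defaultdict(float)
    for row in history:                      # one row per loan per month
        key = stratum(loans[row["loan_id"]])
        balance[key] += row["beginning_balance"]
        prepaid[key] += row["unscheduled_principal"]
    return {k: prepaid[k] / balance[k] for k in balance if balance[k] > 0}

def cpr_vector(smm, months=360):
    """Project a flat CPR vector by annualizing the historical SMM."""
    cpr = 1.0 - (1.0 - smm) ** 12
    return [cpr] * months
```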

There are several third-party vendors that sell models predicting unscheduled cashflows produced by events such as defaults, prepayments, delinquencies and the loss severities that result from loan modifications and from sales out of real-estate-owned (REO) properties. Many investors augment third-party models with proprietary estimates generated by in-house applications. These models typically combine collateral-specific attributes with actual performance statistics and economic projections to derive their predictive cashflow vectors.
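
The following stylized sketch shows how such a model might blend loan attributes with a macroeconomic projection. The logistic form and every coefficient are purely illustrative assumptions, not those of any commercial model.

```python
# Stylized default model: loan attributes plus a projected HPI path.
import math

def monthly_default_prob(fico, current_ltv, hpi_change_12m,
                         b0=-7.0, b_fico=-0.004, b_ltv=0.03, b_hpi=-2.5):
    """Logistic default hazard: weaker credit, higher LTV and falling home
    prices all push the probability up (hypothetical coefficients)."""
    x = b0 + b_fico * fico + b_ltv * current_ltv * 100 + b_hpi * hpi_change_12m
    return 1.0 / (1.0 + math.exp(-x))

def default_vector(fico, ltv, hpi_path):
    """Build a monthly default vector along a projected 12-month HPI-change path."""
    return [monthly_default_prob(fico, ltv, dh) for dh in hpi_path]
```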

Bond-level cashflow

Collateral cashflow vectors are then run through deal cashflow waterfall models that represent the capital structures. These models are used to generate predictive cashflows at the bond level. In many cases, collateral vectors are grouped into sets of like loans (sometimes referred to as ‘collateral groups’). These group designations can be defined either by the capital structure itself or by the bond modeler.
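
A deliberately stylized example of the principle: the sketch below pushes a collateral principal-and-loss vector through a two-tranche sequential-pay structure. Real capital structures add triggers, overcollateralization targets, interest-shortfall mechanics and many more classes, so this only shows how a collateral vector becomes bond-level cashflows.

```python
# Two-tranche sequential-pay waterfall (interest mechanics and triggers omitted).
def run_waterfall(collateral_principal, collateral_losses, senior_bal, sub_bal):
    """Allocate each period's principal sequentially (senior first) and losses
    reverse-sequentially (subordinate absorbs first)."""
    senior_flows, sub_flows = [], []
    for principal, loss in zip(collateral_principal, collateral_losses):
        to_senior = min(principal, senior_bal)
        senior_bal -= to_senior
        to_sub = min(principal - to_senior, sub_bal)
        sub_bal -= to_sub
        sub_writedown = min(loss, sub_bal)          # losses hit the subordinate class first
        sub_bal -= sub_writedown
        senior_bal -= min(loss - sub_writedown, senior_bal)
        senior_flows.append(to_senior)
        sub_flows.append(to_sub)
    return senior_flows, sub_flows
```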

The final step in the mark-to-model valuation workflow is to use the projected bond-level cashflows to produce prices and estimated bond yields. This step often involves the participation of a trading desk or a set of portfolio managers with intimate familiarity with both the RMBS markets generally and the performance of individual bonds, trusts, shelves and market sectors. The desk adds market color and can estimate spreads between bond types and recommend rates to discount future cashflows to determine the present values of specific securities.
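
In its simplest form, that final step is a discounting exercise, as in the sketch below. The monthly compounding convention and the notion of using the coupon plus a desk-supplied spread are illustrative assumptions; discounting at the coupon alone gives the intrinsic value described next.

```python
# Discount projected bond-level cashflows to a present value and a price.
def present_value(bond_cashflows, annual_rate):
    """bond_cashflows: projected monthly principal-plus-interest amounts."""
    r = annual_rate / 12.0
    return sum(cf / (1.0 + r) ** (t + 1) for t, cf in enumerate(bond_cashflows))

def price(bond_cashflows, annual_rate, current_face):
    """Express the present value as a percentage of current face."""
    return 100.0 * present_value(bond_cashflows, annual_rate) / current_face
```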

Intrinsic valuations can be obtained by discounting the bond cashflows at their relevant coupon rate without market color, which removes trading sentiment from the valuation process and provides a consistent method for tracking value over time.

To ensure the viability and broad acceptance of the workflow, there should be a comprehensive and auditable output at each step. This allows a downstream client to mark cashflows to market and better explain and defend calculations to various internal and external stakeholders. The results of each step would be captured in a back-end database, providing a framework for additional downstream analytics performed by either the valuation team or their clients.

In order to properly evaluate an RMBS bond, investors must be able to test the bond under multiple scenarios. Automating loan data processing and integrating analytical models, macroeconomic inputs and waterfalls allows this type of analysis to take place efficiently. This is especially valuable when running a Monte Carlo simulation, in which inputs such as house-price indices and interest rates are varied around baseline projections based on assumptions about volatility and correlations.
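
One possible sketch of the scenario-generation piece is shown below: correlated monthly shocks are applied around baseline HPI and rate projections. The volatilities, the correlation and the lognormal shock form are assumptions chosen purely for illustration.

```python
# Correlated HPI and interest-rate scenarios around baseline projections.
import math, random

def simulate_paths(hpi_base, rate_base, n_scenarios=1000,
                   hpi_vol=0.03, rate_vol=0.004, rho=-0.3, seed=42):
    """Return n_scenarios pairs of (hpi_path, rate_path), each the same length
    as the baseline vectors, with correlated monthly shocks."""
    rng = random.Random(seed)
    scenarios = []
    for _ in range(n_scenarios):
        hpi_path, rate_path = [], []
        for h, r in zip(hpi_base, rate_base):
            z1 = rng.gauss(0.0, 1.0)
            z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
            hpi_path.append(h * math.exp(hpi_vol * z1 - 0.5 * hpi_vol ** 2))
            rate_path.append(max(0.0, r + rate_vol * z2))
        scenarios.append((hpi_path, rate_path))
    return scenarios
```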

Users can readily produce multiple collateral vectors, each of which can be propagated through the waterfall models to create probability distributions of bond values. This generates outputs such as value at risk and other measures of tail risk. By doing this, the user not only knows the most likely valuation, but also gets a sense of how stable the valuation is in the presence of external market events.
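
Given one valuation per simulated scenario, tail-risk measures fall out of the empirical distribution, as in the sketch below; the 95% confidence level is an illustrative choice.

```python
# Empirical value at risk and expected shortfall from simulated bond values.
def tail_risk(simulated_values, base_value, confidence=0.95):
    losses = sorted(base_value - v for v in simulated_values)   # positive = loss
    idx = int(confidence * len(losses))
    var = losses[idx]                                  # loss not exceeded at the confidence level
    tail = losses[idx:]
    expected_shortfall = sum(tail) / len(tail)         # mean loss beyond VaR
    return {"VaR": var, "expected_shortfall": expected_shortfall}
```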

The automation of this complex workflow involves overcoming many operational barriers, including storing interim vectors and other results for use by subsequent processes later in the workflow. To create bulk loan-level vectors, valuation teams use a custom front-end manipulation tool to generate visual output from the loan-level data that drives the internal models. The resulting calculations would then be stored as predictive vectors in the mortgage database.

Stored vectors have two main uses. The first is to allow data users to calibrate the model by comparing predictions to actual loan performance. Part of the automation of this function would include generating graphics comparing cashflow predictions as reflected by the stored vectors with monthly cashflows reported by the bond trustees. This allows valuation teams to recalibrate models, and it also provides downstream clients with a built-in credit surveillance audit report.
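
A minimal sketch of that feedback loop might look like the following, lining up a stored predictive vector against trustee-reported cashflows period by period; the field names and error measures are hypothetical.

```python
# Compare stored predictions with actual trustee-reported cashflows.
def prediction_vs_actual(predicted, actual):
    """predicted, actual: {period: cashflow} for the months reported so far."""
    rows = []
    for period in sorted(actual):
        p, a = predicted.get(period, 0.0), actual[period]
        rows.append({"period": period, "predicted": p, "actual": a,
                     "error": a - p,
                     "pct_error": (a - p) / a if a else None})
    cumulative_error = sum(r["error"] for r in rows)
    return rows, cumulative_error
```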

The second use is to help valuation clients understand the existing vector assumptions and enable them to adjust inputs to customize vectors according to their understanding of an individual deal and/or the underlying collateral. To that end, a data-manipulation application is used to display loans backing individual bonds and easily stratify loan inventories to associate loan characteristics with the predicted payment behaviors.

This allows the client to add value to the collateral groupings by customizing predictive cashflows generated by the workflow and enables both the valuation staff and their clients to review and adjust the vectors generated by the default and prepayment models.
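
As a simple illustration of the kind of adjustment a client might make, the sketch below scales the prepayment vector for one collateral group before the waterfall is re-run; the group key and multiplier are user-supplied assumptions.

```python
# Scale one collateral group's prepayment vector before re-running the waterfall.
def adjust_vectors(group_vectors, group, multiplier, cap=1.0):
    """group_vectors: {group: [monthly SMM, ...]}; returns a modified copy."""
    adjusted = dict(group_vectors)
    adjusted[group] = [min(cap, smm * multiplier) for smm in group_vectors[group]]
    return adjusted

# e.g. slow prepayments on a weaker-credit group by 20% before re-pricing:
# adjusted = adjust_vectors(vectors, ("fico<660", "ltv>80"), 0.8)
```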

Larry Barnett is CEO of Denver-based BlackBox Logic. He can be reached at lbarnett@bbxlogic.com. Bill Hunt is vice president in the Global Markets Analytics group of Opera Solutions, headquartered in New York. He can be reached at bhunt@operasolutions.com.
