The Appraisal Report: Last Bastion For ‘Unstructured Data’ In Mortgage


In case you didn’t notice, the mortgage process is becoming increasingly standardized. Not only did the Great Recession significantly reduce the variety of mortgage products available on the market, but new rules related to underwriting and loan quality, including the Consumer Financial Protection Bureau’s ability-to-repay/qualified mortgage rules, have led lenders to take a production-line approach to loan origination. This “factory-like” approach has done much to standardize the process.

Facilitating this assembly-line approach are today’s loan origination systems and various ancillary systems, which, in essence, are the “machines, pulleys and conveyors” that make the mortgage “factory” run. Also facilitating this new approach is e-document/e-signature technology, which aims to eliminate the manual, paper-based processes that lenders have relied on since the industry’s earliest days.

The interesting thing is that the mortgage factories of today increasingly use just the data contained in the mortgage documents – as opposed to the documents themselves. Because the mortgage business is moving to an all-digital process, the documents are becoming less important than the actual data – and the data has become the fuel that makes the mortgage factory run.

As such, there has been a huge push during the past eight years for the industry to move beyond manual, paper-based processes and start embracing data as the new way of doing things. In some cases, that means replacing the “unstructured data” that was used in the traditional, paper-based processes of the past with “structured” data that can easily be plugged into today’s systems. This use of structured data not only makes it much simpler to automate the mortgage process and improve loan quality, but also enables better, faster analysis of key trends occurring internally and across the entire industry.

There remains, however, a major element in the mortgage process where unstructured data continues to play a huge role: the home appraisal. Although today’s appraisal reports include considerable hard data that can be neatly plugged into the process, the main part of any appraisal report continues to be the “narrative,” which is written by the appraiser in the field. Though it is possible to use optical character recognition to extract certain data from these narratives, there is a question as to how this push for more structured data will impact appraisal reports in the future.
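
To see why narrative text is hard for the mortgage “factory” to consume, consider a minimal sketch, in Python, of rule-based extraction from a narrative that has already been run through OCR. The field names and regex patterns here are invented for illustration; production extraction pipelines are considerably more involved.

```python
import re

# Hypothetical narrative text, as it might look after OCR. The field
# names and patterns below are invented for illustration only.
NARRATIVE = (
    "The subject is a 1,850 sq ft ranch built in 1978 on a 0.25 acre lot. "
    "Condition is average, with a recently updated kitchen."
)

PATTERNS = {
    "gross_living_area_sqft": r"([\d,]+)\s*sq\s*ft",
    "year_built": r"built in (\d{4})",
    "lot_size_acres": r"([\d.]+)\s*acre",
}

def extract_fields(narrative: str) -> dict:
    """Return whichever structured fields the patterns can recover."""
    fields = {}
    for name, pattern in PATTERNS.items():
        match = re.search(pattern, narrative, re.IGNORECASE)
        if match:
            fields[name] = match.group(1).replace(",", "")
    return fields

print(extract_fields(NARRATIVE))
# {'gross_living_area_sqft': '1850', 'year_built': '1978', 'lot_size_acres': '0.25'}
```

Everything the regexes miss – the updated kitchen, the appraiser’s judgment about condition – stays locked in prose, which is precisely the tension the industry is wrestling with.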

Also raising questions about how appraisal reports might change is the increasing demand for more data and transparency from regulators, government-sponsored enterprises (GSEs) Fannie Mae and Freddie Mac, the Federal Housing Administration (FHA), and investors. The GSEs and the FHA, in particular, have already done much to standardize the process and put the emphasis on hard data through the launch of their new appraisal submission and review portals, new quality control (QC) tools, and other appraisal-related technology initiatives.

To learn more about how this push for more structured data and transparency in the mortgage process could impact the appraisal industry, and possibly change appraisal reports in the future, MortgageOrb recently interviewed Jeff Bradford, CEO of Bradford Technologies; Phil Huff, president and CEO of Platinum Data Solutions; Chuck Rumfola, senior vice president of strategic initiatives for Veros; and Greg Stephens, chief appraiser and senior vice president of compliance for Metro-West Appraisal Co.

Q: Are new regulations and new requirements from the agencies (GSEs/FHA), along with their technology initiatives, resulting in a need for more structured data in appraisal reports? Will these new requirements result in a need to expand the number of data fields in the reports? How might this affect appraisal work in the field and in the office? Can you provide some examples of new pieces of hard data that appraisers might be asked to include in their reports in the future that they don’t provide today?

Bradford: Yes, these new regulations and requirements are driving more structured data into the appraisal. And yes, again: the Uniform Appraisal Dataset (UAD) was a good example, where the number of data points that needed to be captured was greater than the number of fields on the report. So, data on the report had to be abbreviated, causing a loss of transparency for the borrower. As the systems get better at analyzing data, they will uncover areas where additional data is required. Recently, more standardized information was required on garages, carports, etc.

Huff: Yes, the agencies’ new regulations and requirements, along with their technology initiatives, have resulted in a need for more structured data in appraisal reports. In general, we’re seeing a trend and movement away from less structured data to more structured data. For example, unstructured PDFs were once standard for appraisals, but the industry has since transitioned to a structured XML format.
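
To make the PDF-versus-XML contrast concrete, here is a minimal sketch of what machine-readable structure buys a reviewer. The XML fragment is simplified and hypothetical; actual UAD/MISMO submissions carry many more fields under a formally defined schema.

```python
import xml.etree.ElementTree as ET

# Simplified, hypothetical appraisal XML; real submissions are far richer.
APPRAISAL_XML = """
<APPRAISAL>
  <SUBJECT_PROPERTY>
    <ADDRESS>123 Main St</ADDRESS>
    <CONDITION_RATING>C3</CONDITION_RATING>
    <GROSS_LIVING_AREA unit="sqft">1850</GROSS_LIVING_AREA>
  </SUBJECT_PROPERTY>
  <OPINION_OF_VALUE currency="USD">285000</OPINION_OF_VALUE>
</APPRAISAL>
"""

root = ET.fromstring(APPRAISAL_XML)
subject = root.find("SUBJECT_PROPERTY")

# Unlike a scanned PDF, every value is addressable by name, so a review
# system can validate it without OCR or a human reader.
record = {
    "address": subject.findtext("ADDRESS"),
    "condition": subject.findtext("CONDITION_RATING"),
    "gla_sqft": int(subject.findtext("GROSS_LIVING_AREA")),
    "value_usd": int(root.findtext("OPINION_OF_VALUE")),
}
print(record)
```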

I’m not sure that it will create the need to expand the number of data fields in appraisal reports as much as it will call for a certain type of data. Appraisals will become less about the appraiser’s subjective viewpoint – and more about actionable data. If the industry wants to take advantage of the progress in the tech world, appraisals will comprise less information and more data. Machines read data. Information is for human consumption.

This transition to actionable data will likely impact appraisal work with the proliferation of tablet and smartphone use in the field and greater analytics in the office. We can also expect to see applications that leverage advances, such as geolocation software, which can speed up the process.

I think the appraisal industry will evolve the appraisal report to include additional hard data, such as measurements taken with smartphones and photos that are immediately uploaded through a secure application that indicates precisely where and when that photo was taken. We’ll probably also start seeing more hard data that helps lenders assess the quality of the appraisal, such as data on the appraiser’s discovery process and the model analysis he or she used, candidate comps, or MLS data.

Rumfola: From Veros’ vantage point, the focus right now is not on the appraisal forms. The focus is on closing data – specifically, the Uniform Closing Dataset (UCD), which falls under the Uniform Mortgage Data Program (UMDP) along with the UAD.

When UAD was initially developed, it was understood this effort would need to be revisited at some point, but the work done jointly by Fannie Mae and Freddie Mac is solid and has been the foundation successfully adopted by numerous other entities. We would envision UAD will be reopened at some point in time, but presently, the agencies can shift their focus to other important areas.

Collecting closing data is at the forefront, and behind that is the loan application data (the Uniform Loan Application Dataset). Just as the agencies focused on standardizing, then collecting, then analyzing the appraisal data, the industry should expect the same progression for these and other important datasets. These efforts will require focus more from the seller/servicers and their supporting technology providers than from the appraisers or appraisal technology firms.

Stephens: I am convinced the deployment of the GSEs’ Uniform Collateral Data Portal (UCDP) in 2011 set in motion a new paradigm within the valuation industry. Between 2011 and 2015, Fannie Mae reports, it received more than 14 million electronic appraisal reports through its data portal. Utilizing that data, in addition to other third-party data, Fannie Mae developed Collateral Underwriter (CU) and began making it available to its correspondent lenders. With the access to this new technology and the data provided by the GSEs, for the first time in the history of appraising, the users of appraisal services now have access to as much – if not more – data than the appraisers completing the assignments. Using the data available in CU, those users are questioning the contents and analysis within the appraisal reports.

In response, appraisers are, by necessity, expanding their use of technology programs that aggregate market data and incorporate trending charts, graphs and regression tables to produce more meaningful reports. Certain software programs let the appraiser add to the addenda sales data that was considered but not used in arriving at an appraised value. This proactive display of data reduces the number of requests from lenders using CU, who now have access to the area sales that are automatically displayed as part of the CU output.

Due to the limitations of the data fields within the GSE appraisal forms, this additional data is being included as supplemental addenda. There are no indications from the GSEs that the appraisal forms, last updated in 2005, will be revised in the near future – especially given that the millions of data points used to generate CU results are directly linked to the data fields in the 2005 versions of the GSE forms.

Q: Why are investors demanding more structured data and increased transparency in the appraisal process? Could this, too, result in increased data fields in the reports and an expansion of the basic structured data? How might investor demand for more data and increased transparency change the reports in the future?

Bradford: I believe everyone is demanding more transparency in the appraisal process. Today, it’s easy to second-guess the appraiser’s conclusion, so more information is needed to support that conclusion, including how comparables were selected, how adjustments were determined, and the reconciliation thought process used to arrive at the final value conclusion.
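
As a toy illustration of the arithmetic behind the transparency Bradford describes, here is a minimal sketch of a sales comparison grid. All figures and weights are invented, and real reconciliation is a qualitative judgment rather than a fixed weighted average.

```python
# Toy sales-comparison grid: figures and weights are invented for
# illustration; a real appraisal involves judgment at every step.
comps = [
    {"sale_price": 300_000, "adjustments": {"gla": -8_000, "garage": +5_000}, "weight": 0.5},
    {"sale_price": 280_000, "adjustments": {"condition": +10_000}, "weight": 0.3},
    {"sale_price": 310_000, "adjustments": {"lot": -12_000, "bath": -4_000}, "weight": 0.2},
]

def adjusted_price(comp: dict) -> int:
    """Sale price plus all line-item adjustments for one comparable."""
    return comp["sale_price"] + sum(comp["adjustments"].values())

# Reconciliation modeled here as a weighted average of adjusted prices.
reconciled = sum(adjusted_price(c) * c["weight"] for c in comps)

for i, c in enumerate(comps, 1):
    print(f"Comp {i}: adjusted to ${adjusted_price(c):,}")
print(f"Reconciled value opinion: ${round(reconciled):,}")
```

Each step above – which comps appear, what each adjustment is, how the weights are set – is exactly the kind of reasoning reviewers now want exposed as data rather than buried in narrative.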

Huff: Investors are definitely demanding more structured data and increased transparency in the appraisal process. This is because they are keenly aware of two things: the importance of collateral when it comes to mortgage risk and the difficulty in achieving consistency and accuracy in “QC-ing” appraisals. This could have an impact on the data fields in appraisal reports, and the industry could very well respond by restructuring appraisal reports to include the data that investors are seeking. For example, investors want to know why candidate comps weren’t used, so appraisal reports could be restructured to include data that supports their needs.

Investors know that they will benefit from more data. As a result, as an industry, we’re moving toward greater and greater transparency. Investors, as well as most of the industry, understand that the appraisal process is not an exact science but must be supported by good data. They’ve learned from our experiences in 2007 and 2008. We can trust, but we must verify. We know that investors are going to request greater assurances, so as an industry, we need to take the steps to provide the data, QC and transparency they need.

Rumfola: Yes, but the investors that are focused on this are not the GSEs – it is coming from the private-label securities (PLS) market. The GSEs have already changed their policies and collect the data for improved risk management. We’re working with the PLS market to educate them so they can adopt some of the same practices. These investors are focused on more transparency into the underlying values, which would give them the ability to benefit from appraisal data and better manage their risk profile. This model has proven to be successful for the GSEs and will be a critical component of a PLS market return.

There are some important efforts around this focus right now. One of these is the development of the Structured Finance Industry Group’s (SFIG) “3.0 Green Paper,” which aims to provide more specifics and best practices related to reviving the PLS market. Regarding appraisals, the green paper states, “The appraisal complies, in both form and substance, to the Fannie Mae or Freddie Mac standards for mortgage loans of the same type as the mortgage loans and Uniform Standards of Professional Appraisal Practice standards and satisfies applicable legal and regulatory requirements.”

Stephens: At the two-day American Enterprise Institute/Collateral Risk Network conference in Washington, D.C., last fall, several investors reported the need for more meaningful data from appraisers than simply placing three, six or nine comparable sales and listings on a grid and arriving at a point value. The need for more meaningful analysis of economic supply and demand and area market trends was emphasized. That analysis will become significantly more relevant as the various markets around the country experiencing an economic boom begin to stabilize and, in some instances, experience over-supply and the beginning of a downward trend. As mentioned in the response to the first question, the limitations of the GSE form data fields necessitate the use of expanded addenda. Fortunately, there are numerous software programs currently available to appraisers that enable the display of robust amounts of data to support and augment the analysis developed within the appraisals.

Q: Is it safe to say that some of this demand for structured data is being facilitated by technology (i.e., the portals and QC tools)? As technology advances and becomes a bigger part of the appraisal process, will it fuel even greater demand for collection of hard data? Does technology make it easier for an agency to say, “Hey, let’s ask them to collect this, and this, and this, and this … ”?

Bradford: It’s all being driven by technology and the fact that the big banks process so many loans, making it almost impossible for appraisals to be manually reviewed with any type of efficiency. Standardizing the data and the process makes the whole collateral valuation process much more efficient.

Huff: Technology is absolutely facilitating the demand for structured data. There’s no way we’d be able to provide this level of scrutiny without it. I’m not sure that technology itself will fuel a greater demand for collection of hard data. Rather, it will make data collection easier, and it will provide the vehicle for delivering that data. I’m not going to say that agencies will only demand the bare minimum of what they need. However, I don’t see them asking for additional data unless they believe that data will help them transact better, higher-quality, less risky loans either in the present or in the future.

Rumfola: Bear in mind, first, that appraisal forms already had a “standard” in the use of standard forms such as the 1004. The issues addressed by technology were (a) the lack of standardization around how those forms were filled out (e.g., rating the condition of a property), (b) the inability to collect the data or enforce any uniformity, and most importantly, (c) the inability to analyze the data proactively in a way that could minimize risk.
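
Point (a) is easy to picture in code. The C1–C6 condition and Q1–Q6 quality scales are the UAD’s actual rating sets; the validation routine below is a simplified sketch of the kind of uniformity check that standardization made possible.

```python
# UAD replaced free-text condition descriptions with standardized codes.
# C1-C6 and Q1-Q6 are the actual UAD rating scales; the validation
# logic itself is a simplified illustration.
UAD_CONDITION_RATINGS = {"C1", "C2", "C3", "C4", "C5", "C6"}
UAD_QUALITY_RATINGS = {"Q1", "Q2", "Q3", "Q4", "Q5", "Q6"}

def validate_ratings(report: dict) -> list:
    """Return error messages for any non-conforming rating fields."""
    errors = []
    if report.get("condition") not in UAD_CONDITION_RATINGS:
        errors.append(f"condition {report.get('condition')!r} is not a UAD C-rating")
    if report.get("quality") not in UAD_QUALITY_RATINGS:
        errors.append(f"quality {report.get('quality')!r} is not a UAD Q-rating")
    return errors

# Pre-UAD-style free text fails; standardized codes pass.
print(validate_ratings({"condition": "average", "quality": "Q4"}))
print(validate_ratings({"condition": "C3", "quality": "Q4"}))  # []
```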

From where we stand today, the availability of this appraisal data for risk management is having a tremendous impact, and we should expect the ability of lenders and the agencies to collect and analyze data to continue to shape QC applications. Fannie Mae’s CU and Freddie Mac’s Loan Coverage Advisor are proof of the agencies’ movement in this direction. Similarly, technology tools such as those that score appraisals or manage a fully digital appraisal workflow are evidence of this on the lenders’ side. The structured-data mentality is allowing the mortgage industry to standardize, collect, analyze and evolve.

As to whether this foundation makes it easier for a stakeholder to ask for data point after data point, this is not something we see as a mounting concern. First and foremost, investors are not showing any indication of “collection for collection’s sake.” But more practically, you have to consider the path that any new data point (be it one or many) has to travel down in order to be implemented. In the appraisal space, any new standardized data point has to make its way into the appraisal form; work must be done by forms vendors; and appraisers need to update their software and understand the new data point. Lenders then need to ensure they can collect the data point and apply it where needed, and tech vendors must update their integrations to pass the data into the portal. Even for one data point, there is a high-touch factor due to the way the Uniform Collateral Data Portal has connected the entire workflow. If or when there are new data points, they are going to be added in a practical manner.

Stephens: There can be no doubt the portals and QC tools are driving the demand for more structured data. Providers of technology-enhanced programs report a significant increase in appraiser subscriptions and utilization since the deployment of CU by Fannie Mae – followed by Freddie Mac’s adoption, the Department of Veterans Affairs’ adoption of CoreLogic’s QC tool, and the FHA’s soon-to-be-deployed electronic appraisal data portal. There can also be no doubt that as more data is incorporated into appraisal reports, appraisers will need to provide more meaningful analysis of that data. History has shown us that as more data becomes available, more analysis is required to make the data meaningful to its users.

Q: What role could/does technology play in standardizing the appraisal process?

Bradford: Technology plays a huge role in standardizing the appraisal process. In particular, technology plays a big role in the standardization of the data in the appraisal. With standardization, systems can be built to review, analyze and validate the appraisal, adding additional reliability and reducing the overall risk.

Huff: Data is the key to standardizing the appraisal process, and technology is the facilitator. Technology helps us create a standardized level of QC. It standardizes the process and the way data is accessed and presented. When a lender or investor opens an appraisal, all of the QC information should be available up front – we have the capacity to do that today.

There’s practically no limit to the impact technology can have on the appraisal process. Tools such as geolocation software can essentially transition the appraisal ordering process into an Uber-like model, where lenders can see which appraisers are in the vicinity of a property – just like we can see how many Uber drivers are in our area – send out an order, and appraisers either accept or decline the assignment.
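
Here is a minimal sketch of the matching logic behind such a model, using the standard haversine great-circle distance formula; the appraiser names, coordinates and search radius are invented for illustration.

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in miles."""
    r = 3958.8  # mean Earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Invented appraiser locations for illustration.
appraisers = [
    {"name": "Appraiser A", "lat": 36.16, "lon": -86.78},
    {"name": "Appraiser B", "lat": 36.05, "lon": -86.95},
    {"name": "Appraiser C", "lat": 35.85, "lon": -86.39},
]

def nearby(subject_lat, subject_lon, radius_miles=15):
    """Appraisers within the radius of the subject property, closest first."""
    hits = [
        (haversine_miles(subject_lat, subject_lon, a["lat"], a["lon"]), a["name"])
        for a in appraisers
    ]
    return sorted((d, n) for d, n in hits if d <= radius_miles)

print(nearby(36.17, -86.80))  # A and B are in range; C is too far
```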

Rumfola: Let’s separate “standardization” into two areas. First are the actual data points and their meaning, and second is the process behind the transmission of the data. Technology really plays a role in the latter. What technology effectively does, in this capacity, is allow two-way communication to take place between lender and GSE. Technology allows the lender to receive feedback, in a very black-and-white manner, on whether its investor believes it is adhering to the standard. Technology allows a consistent, real-time flow of messaging back to the lender, resulting in more proactive dialogue related to the loan’s valuation data. With the improved dataflow, proactive communication and enhanced risk management, the GSEs have been able to modify their rep-and-warrant framework and improve the business model related to loan securitization on both sides of the fence.

Stephens: With the deployment of the GSEs’ UCDP in 2011, technology has been a primary driver of the standardization of the appraisal process by way of the GSEs’ data field standardization requirements. The introduction of standard codes that must be adhered to before an appraisal report can be delivered electronically forces appraisers to comply.

Q: Are we moving toward a day when the narrative in an appraisal report will take a back seat to hard, quantifiable data?

Bradford: We are moving closer, but I don’t believe that day will ever arrive. There is a limit to the factual, standardized data points that can describe a property. Every property is unique in some respect, and this will require some narrative to describe and account for in the valuation.

Huff: There will always be a need for local expertise and physical evaluation of a property. That said, the industry in general is moving toward incorporating more hard, quantifiable data into the mortgage decision. There’s more data than ever to use as part of that decision today, as well. Technology should be a human enhancement, not a human replacement.

Stephens: The relevance of narrative content within appraisal reports is directly related to the analysis and conclusions supporting the data in the report and the addenda. Other narrative content within appraisal reports will continue to trend toward significantly lesser relevance.

Comments
Ron
8 years ago

How wonderful for the lending industry, assembly line production. They complain about the narrative part of the report. What their “mechanical data” doesn’t have is one of the most important parts, The eyes and ears of the Appraiser, and his analysis and judgement.

Another thing is the Character of the Buyer. People are individuals with individual strengths and weaknesses. Past performance or affidavits about a buyer’s character tell a lot. Some may try harder than others to maintain a family and a home.

Al
8 years ago

Quote from the article: “With the access to this new technology and the data provided by the GSEs, for the first time in the history of appraising, the users of appraisal services now have access to as much (if not more) data than the appraisers completing the assignments. Using the data available in CU, those users are questioning the contents and analysis within the appraisal reports.” This says it all. The GSE’s and Lenders are refusing to make their data available to the Appraiser, then using that data to question an appraisal. The Appraiser’s analysis is based on the available…

Kevin Goodale
8 years ago

To summarize, millions of appraisal reports are being vetted for the government to share with the lenders to pin point appraisal inaccuracies for ‘Quality Control’; and yet this information is not shared with the appraisers. Sounds like a game of ‘gotcha’ to me.