Some AI Decisions Negatively Impacting Lending?

By Ray Birch | 10.07.2019

As lenders begin to use artificial intelligence (AI) as part of the underwriting process, concerns are arising about decisions AI might make that could negatively impact borrowers and lenders.

The concerns have been strong enough to draw the attention of Congress, with a recent hearing by the House of Representatives Committee on Financial Services’ Artificial Intelligence Task Force looking into the question.

“When done right, artificial intelligence can mean innovative underwriting models that allow millions more people access to credit and financial services,” said Rep. Bill Foster (D-Ill.), chair of the task force on artificial intelligence. But Foster said AI also raises key questions, such as, “How can we be sure that artificial intelligence credit underwriting models are not biased? Who is accountable if artificial intelligence algorithms are just a black box that nobody can explain when it makes a decision?”

The Risk to Lenders

It isn’t just borrowers who might be at risk; so are lenders. Analysts have noted that in cases where borrowers are wrongfully declined by software using artificial intelligence, the risk isn’t limited to reputation; there is also the potential for lawsuits.

“Further complicating the use of AI is that underlying bias is often present in these technologies, and unless credit union leaders ask the right questions and carefully evaluate the algorithms driving the technology and the social contexts in which the technology is used, they could unwittingly participate in furthering inequality in their communities,” stated Chandra Middleton, a PhD candidate at the University of California, Irvine, in a recent report on the topic from the Filene Research Institute.

Middleton pointed to algorithmic bias—how human biases become encoded in mathematics—that can be subconsciously embedded in artificial intelligence.

Limited Use–For Now

Leslie Parrish, senior analyst at Aite Group, said financial institutions may someday use AI extensively in the underwriting process, but for now most are using machine learning to determine the best ways to structure their underwriting models. She said a majority of lenders currently shy away from allowing AI to perform underwriting due to concerns about unpredictable or inconsistent decisions, and how those decisions might have to be explained.

“A lot of folks at financial institutions have told us that, yes, AI could be extremely useful to them for underwriting,” said Parrish. “But most of them said they’re still not using it because of these exact concerns about the decisions AI can make. Even the fintechs say they are staying away for this reason.”

The Need for Speed

Nevertheless, the appeal of AI for loan underwriting is no secret: speed. Analysts said the same pressure that is pushing financial institutions to move faster on every aspect of a loan application, from submission through approval, is what is drawing lenders toward AI.

“There are already some lenders that are using AI for underwriting, albeit quite carefully,” Parrish said. “They have figured out how to continually update the AI and keep track of the decisions being made, generating explainable reason codes for why people are being turned down. So there are ways to do it.”
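The “explainable reason codes” Parrish mentions can be illustrated with a minimal sketch. Everything here is invented for the example — the feature names, the contribution values, and the code wording; real systems derive per-feature contributions from the model itself (for instance with SHAP-style attribution) and use regulator-aligned adverse-action code lists.

```python
# Hypothetical sketch: turning a model's per-feature contributions into
# adverse-action reason codes. All names and values below are invented.

REASON_CODES = {
    "credit_utilization": "R01: Proportion of balances to credit limits is too high",
    "recent_delinquency": "R02: Recent delinquency on an account",
    "credit_history_length": "R03: Length of credit history is too short",
    "inquiries": "R04: Too many recent credit inquiries",
}

def top_reason_codes(contributions, n=2):
    """Return the n reason codes for the features that pushed hardest toward
    denial. `contributions` maps feature -> signed contribution, where a
    positive value pushed the model toward declining the applicant."""
    worst = sorted(contributions, key=contributions.get, reverse=True)[:n]
    return [REASON_CODES[f] for f in worst if contributions[f] > 0]

# Toy contributions for one declined applicant
contribs = {
    "credit_utilization": 0.35,
    "recent_delinquency": 0.20,
    "credit_history_length": -0.05,
    "inquiries": 0.10,
}
print(top_reason_codes(contribs))
```

The point of the design is the one Parrish raises: every automated decline is traceable to named factors a human can read back to the borrower, rather than an opaque score.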

How AI is Now Being Used

For now, most lenders using AI in underwriting are using it to build and refine their underwriting models.

“They are using AI to understand which attributes might be good to include in underwriting models,” said Parrish. “They're using machine learning to look at all kinds of data sources to see what they want to use, and what might be particularly predictive. They are doing things like using AI to weigh the inclusion of an alternative credit score in addition to the traditional credit score, things like that.”

A key concern in deploying AI to underwrite loans is the weight machine learning might arbitrarily assign to a lending attribute or guideline. For example, Parrish said a machine might place too great an emphasis on a credit score, or some other aspect of a borrower’s history, giving that factor a disproportionate influence on the decision to turn a person down.

“There is a good deal of fear now among lenders on this potential disproportionate impact from AI in underwriting,” she said. “You can run into cases of protected classes being disproportionately turned down. Maybe that machine learning algorithm is combining different pieces of data together and the lender is not really on top of the AI underwriting model that's creating some sort of result they don't necessarily understand.”
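One common way to monitor for the disproportionate declines Parrish describes is the “four-fifths rule” used in fair-lending and employment-discrimination analysis: the approval rate for a protected group should be at least 80% of the most-favored group’s rate. The sketch below assumes invented group labels and decisions purely for illustration; it is a monitoring heuristic, not a legal test.

```python
# Hypothetical disparate-impact check on a model's decisions using the
# four-fifths rule. Group membership and decisions are invented toy data.

def approval_rate(decisions):
    """Fraction of applications approved (True = approved)."""
    return sum(decisions) / len(decisions)

def passes_four_fifths(group_a, group_b, threshold=0.8):
    """True if the lower approval rate is at least `threshold` of the higher."""
    lower, higher = sorted([approval_rate(group_a), approval_rate(group_b)])
    return lower / higher >= threshold

# Toy model outputs for two applicant groups
group_a = [True, True, False, True, True]    # 80% approved
group_b = [True, False, False, True, False]  # 40% approved

print(passes_four_fifths(group_a, group_b))  # 0.4 / 0.8 = 0.5 -> False
```

A lender running a check like this over each retraining cycle would catch the scenario Parrish describes, where combined data features quietly produce a result the institution does not understand.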

Difficult Questions

As noted, if borrowers are wrongfully turned down for a loan by an AI decision, the lender could find itself in court, possibly without being able to provide much of an explanation for why the consumer was turned down beyond saying the decision was the result of machine learning. The CU could also find it difficult to answer examiners’ questions about lending decisions if the institution does not have a strong grasp of its AI underwriting model, Parrish said.

Ultimately, in the eyes of examiners and likely the courts, a person is behind all decisions made by AI, Parrish said.

“Whether the lender made the decision with their lending team or they gave decisioning rights to a machine, at the end of the day it's a person making the call,” she said. “The lender programmed the machine. I'm not a lawyer, but I assume you can't just pass the buck to the computer.”

A Good Match for CUs?

Is AI in underwriting a good match for credit unions, which rely a great deal on interaction with the borrower, often making exceptions for personal circumstances?

Parrish said it could be.

“I think AI has positives and negatives for the borrower and the lender,” she said. “Machine learning could be very useful, for example, if an institution is trying to add more types of data into its underwriting model—maybe to extend beyond traditional credit scores and include other factors. I think machine learning could help them figure out what those are and could potentially increase the number of people they could reach to provide credit—even help in finding ways to offer more competitive rates and terms.”

“Careful and Thoughtful”

But to go further with AI and use it for making underwriting decisions requires lenders to be very careful and thoughtful, she said.

“Maybe going a bit slowly at first, using it for marketing purposes, before putting it in an underwriting model and using AI for more sophisticated things,” Parrish said.

No matter how far lenders go with AI for underwriting in the coming years, Parrish believes a human being will always be involved.

“I think it’s certainly possible we could have automated underwriting,” she said. “But there will always be a certain number of marginal calls, where the machine can’t make the final decision and the loan application will have to be reviewed by a person. There are a lot of borderline calls in lending, and that's where a person is always going to have to step in—almost no matter what.”

How Regulators Can Help

Parrish further believes regulators can help financial institutions by providing guidance on the use of AI in underwriting. She also believes a war for talent is brewing in this area, and smaller financial institutions may fall behind. 

“One of the biggest hurdles in AI being used by more financial institutions for underwriting is the internal technical capacity of the institution,” said Parrish. “If financial institutions are going to be using these new techniques, they need to either work with a third-party provider or develop something in-house. What I've been hearing is that competition for this type of talent is now pretty serious. FIs are not known for being leaders in this area, and a lot of the talented people are not going to the traditional financial sector, let alone to smaller community banks and credit unions. It’s getting very costly to hire and retain these people.”

Middleton, in her Filene report, stated there is a strong belief that credit unions will need to adopt AI-based solutions—not only for lending but across many operational areas—to keep up with industry competition, especially as other lenders cut costs. She said the biggest risk to the credit union system from AI is not adopting it, adding that by not using the technology “we will lose market share.”

Reprinted with permission from CUToday.info, a leading source of news and resources for credit union decision-makers.