UK regulators warn banks on use of AI in loan applications


UK financial regulators have warned banks looking to use artificial intelligence to approve loan applications that they can only deploy the technology if they can prove it will not worsen discrimination against minorities, who already struggle to borrow.

The watchdogs are increasingly pressing Britain’s biggest banks about the safeguards they plan to put in place around the use of AI, according to several people familiar with the talks.

High street banks are exploring ways to automate more of their lending, including using AI and more advanced algorithms to decide whom to lend to, based on historical data on different types of borrowers grouped by categories such as postcode and employment profile.
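To make that concrete, the sketch below shows the kind of model being described, trained on historical repayment outcomes using coarse borrower categories such as postcode area and employment type. It is a hypothetical illustration on synthetic data, not any bank’s actual system, and the feature names are invented for the example.

```python
# Hypothetical sketch of an automated credit-approval model of the kind
# described above. Synthetic data only; not any bank's actual system.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import OneHotEncoder

rng = np.random.default_rng(0)
n = 5_000

# Invented borrower categories of the kind mentioned in the article.
postcode_area = rng.choice(["N1", "E8", "SW3", "M14"], size=n)
employment = rng.choice(["salaried", "self-employed", "contract"], size=n)
income = rng.normal(30_000, 8_000, size=n)

# Historical repayment outcome (1 = repaid), here tied only to income.
repaid = (rng.random(n) < 1.0 / (1.0 + np.exp(-(income - 25_000) / 10_000))).astype(int)

# Encode the categorical features and fit a simple approval model.
categorical = np.column_stack([postcode_area, employment])
X = np.column_stack([OneHotEncoder().fit_transform(categorical).toarray(), income])

model = LogisticRegression(max_iter=1_000).fit(X, repaid)
print("training accuracy:", round(model.score(X, repaid), 3))
```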

Banks believe using machine learning techniques to make lending decisions could reduce discrimination against ethnic groups who have historically struggled to access reasonably priced loans. They feel AI would not make the same subjective and unfair judgments as humans.

“The banks would quite like to get rid of the human decision maker because they perceive, I think correctly, that is the potential source of bias,” said Simon Gleeson, a lawyer at Clifford Chance.

But the regulators and campaign groups fear that use of AI in credit models could have the opposite effect. “If somebody is in a group which is already discriminated against, they will tend to often live in a postcode where there are other [similar] people . . . but living in that postcode doesn’t actually make you any more or less likely to default on your loan,” said Sara Williams, of Debt Camel, a personal finance blog.

“The more you spread the big data around, the more you’re going after data which is not directly relevant to the person. There’s a real risk of perpetuating stereotypes here.”
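The mechanism Williams describes can be illustrated with a small synthetic simulation: group membership is never given to the model, but because postcode correlates with group and the historical labels carry past bias, the model still scores the two groups differently even though their true default risk is identical. This is an illustrative sketch, not an analysis of any real lender’s data.

```python
# Sketch of proxy discrimination on synthetic data: the protected group is
# hidden from the model, but postcode acts as a proxy for it.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 20_000

group = rng.integers(0, 2, size=n)               # protected group, never seen by the model
# Postcode correlates with group (residential clustering), not with risk.
postcode = np.where(rng.random(n) < 0.8, group, 1 - group)
true_default_prob = np.full(n, 0.10)             # identical true risk in both groups
defaulted = (rng.random(n) < true_default_prob).astype(int)

# Historical labels carry bias: borrowers in postcode 1 were recorded as
# defaulting slightly more often (e.g. through higher past pricing).
label = defaulted | ((postcode == 1) & (rng.random(n) < 0.05)).astype(int)

model = LogisticRegression().fit(postcode.reshape(-1, 1), label)
scores = model.predict_proba(postcode.reshape(-1, 1))[:, 1]

for g in (0, 1):
    print(f"group {g}: mean predicted default risk = {scores[group == g].mean():.3f}")
```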

James Daley, founder of advocacy group Fairer Finance, said there were already concerns about the way data was used to price both credit and insurance because it “isolates the most vulnerable” by offering the same high pricing that those types of customers have traditionally received.

This leads to a cycle where those in groups who have traditionally had high defaults are charged higher interest rates, which in turn makes them more likely to default. “The idea that you add machine learning into that makes the whole cocktail even worse,” Daley said.

Last year, the chairs of two US congressional committees urged regulators to ensure the country’s biggest lenders implemented safeguards to ensure AI improved access to credit for low- and middle-income families and people of colour, rather than amplifying historic biases.

In their submission on regulating digital finance, the EU’s financial regulators last week called on lawmakers to consider “further analysing the use of data in AI/Machine Learning models and potential bias leading to discrimination and exclusion”.

Banks in the UK were cleared of racism in loan decisions by a government review almost a decade ago but were still found to be lending less to ethnic minorities.

Gleeson said that recent conversations with regulators focused on issues such as built-in safeguards to prevent AI-led lending from charging higher rates to minorities who have typically paid more in the past.

An October roundtable convened by the Bank of England and the Financial Conduct Authority discussed an ethical framework and training around AI, including some human oversight and a requirement that banks can clearly explain the decisions taken.
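One hypothetical way a lender might meet a requirement to explain its decisions is to report, for each declined applicant, the features that pushed the model’s score furthest towards rejection. The sketch below uses invented feature names and illustrative weights; it is not a description of the framework discussed at the roundtable.

```python
# Illustrative "reason code" explanation for a simple linear credit model.
# Feature names and weights are invented for the example.
import numpy as np

feature_names = ["income", "existing_debt", "years_at_address", "missed_payments"]
coefficients = np.array([0.8, -1.2, 0.3, -1.5])   # illustrative model weights
applicant = np.array([0.4, 0.9, 0.1, 0.7])        # standardised applicant values

# Each feature's contribution to the applicant's score.
contributions = coefficients * applicant
order = np.argsort(contributions)                  # most negative contributions first

print("Main factors counting against this application:")
for i in order[:2]:
    print(f"  {feature_names[i]}: contribution {contributions[i]:+.2f}")
```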

One executive at a large UK bank said everyone in the industry was “thinking about and doing work on” how to deploy AI in an ethical way. Others said their banks were at an early stage of exploring how to use it.

UK Finance, the lobby group, said it recognised “the essential need to maintain public trust” as the industry explored the use of AI and acknowledged “potential unfair bias” was an issue. The Prudential Regulation Authority and FCA declined to comment.
