Trends & Research

Access the power of data and objective insight. Data from various sources, including NEACH surveys and member interviews, is compiled and made available as white papers, case studies, articles, benchmarking, and industry reports to provide a snapshot of both the current and future payments landscape. 

Published on Monday, October 30, 2023

CFPB Issues Guidance on Credit Denials by Lenders Using Artificial Intelligence

Payments Report: News from Washington, Brought to you by NEACH
VOLUME 2023-8 (OCT 23)

Overview: On September 19, 2023, the Consumer Financial Protection Bureau (“CFPB”) issued guidance describing procedures that lenders must follow when using artificial intelligence (“AI”) and other complex credit underwriting models (the “Guidance”). The Guidance states that lenders must provide specific and accurate reasons for denying credit when using AI models. This means that lenders may not be able to use the CFPB’s sample adverse action forms if those forms do not include the actual reason for denial of credit. The Guidance builds upon previous CFPB policy statements concerning AI and suggests the CFPB remains skeptical of the use of AI models.

Background  

In the past two years, the CFPB has issued a series of interpretations and policy statements concerning AI. These statements have primarily focused on the intersection of AI and fair lending principles, including the CFPB’s concern that lenders’ use of AI and other complex models could have unintended discriminatory effects. For example, in one of his first public statements after assuming leadership of the CFPB, Director Rohit Chopra expressed concern that AI models could exacerbate lending bias and indicated that the CFPB would conduct investigations of lenders’ use of AI.

The CFPB’s primary fair lending authority comes from the Equal Credit Opportunity Act (“ECOA”). In addition to prohibiting lenders from discriminating in providing credit, ECOA requires lenders to provide notices to consumers when taking adverse action on credit applications. ECOA requires that these adverse action notices include a statement of the specific reasons a lender took adverse action.

In May 2022, the CFPB issued guidance affirming that ECOA requires lenders to explain the specific reasons for taking adverse actions, even when their use of AI and other complex algorithmic underwriting models may make it difficult to identify those reasons with precision. The September 2023 Guidance expands upon the May 2022 guidance.

The Guidance Limits the Utility of Sample Adverse Action Forms

The regulation that implements ECOA, Regulation B, has long included sample adverse action notice forms with a checklist of approximately a dozen potential reasons a lender may select from in describing why it has taken adverse action on a credit application. The forms also include a blank “other” category that lenders may use to customize the sample form. Most lenders use these sample forms, or close adaptations of them, to provide adverse action notices to consumers.

The Guidance clarifies the CFPB’s view that the sample forms are merely illustrative and may not be appropriate for all lenders. It further clarifies that reliance on the checklist of reasons in the forms will only satisfy a lender’s adverse action obligations under ECOA if the reasons disclosed specifically describe the principal reason(s) the lender took adverse action on a credit application.

In particular, the Guidance states that a lender may not rely solely on the unmodified checklist of reasons in the sample forms if those reasons do not reflect the principal reason(s) for the adverse action. If the principal reason(s) a lender actually relied on are not accurately captured by the checklist, the lender must either modify the sample form or check “other” and provide an appropriate, specific explanation.

Lenders’ Use of Alternative Data and “Black-Box” Algorithms

In the CFPB’s view, it is increasingly likely that lenders using AI or other complex underwriting models may not be able to rely simply on the list of reasons in the sample adverse action notice checklist. The sample forms generally refer to traditional credit underwriting practices, and may not contain sufficient specificity where lenders’ AI models incorporate “alternative” data gathered outside of a credit application or credit report. For example, the CFPB states that if a credit denial results from an AI model that incorporates a consumer’s profession, providing a statement that the consumer had insufficient income (which is a reason included on the sample form) would likely not be precise enough to meet the lender’s obligation under ECOA.

The Guidance also emphasizes that adverse action notices may be required for consumers with existing credit lines if, for example, a lender decides to lower a consumer’s credit limit or close a consumer’s credit account altogether. In circumstances where a lender uses behavioral data to take that form of adverse action, specific disclosures are required. For example, if a lender takes adverse action based on the stores where a consumer shops or the types of goods a consumer purchases, it would likely be insufficient for a lender to simply state that “purchasing history” is the reason for adverse action. Instead, a lender would likely need to disclose specific details about a consumer’s purchasing history, such as the type or location of the store, the type of goods purchased, or other relevant information.

The Guidance also makes clear that, in the CFPB’s view, lenders must provide specific adverse action notices regardless of the technology used to arrive at credit decisions. For example, lenders still have an obligation under ECOA to provide a notice that includes the specific reason(s) for adverse action even when using a so-called “black box” algorithm that makes it challenging to ascertain the specific reason a model recommends denial of credit. In the CFPB’s view, if a model cannot provide a specific reason for adverse action, it may not be appropriate for a lender to use that model in consumer credit decisioning.

Outlook: The Guidance continues the CFPB’s series of policy statements and interpretations expressing skepticism of the use of AI and “black box” models in the consumer financial services industry. Lenders using such models should prepare for the CFPB to focus its supervisory efforts on compliance with ECOA and other existing consumer financial services laws.

______________________________________________________________________________________________________________________

AUTHOR INFORMATION:

Craig Saperstein, a member of Nacha’s Government Relations Advisory Group, is a partner in the Public Policy practice of Pillsbury Winthrop Shaw Pittman LLP in Washington, D.C. In this capacity, he provides legal analysis for clients on legislative and regulatory developments and lobbies congressional and Executive Branch officials on behalf of companies in the payments industry.

Deborah Thoren-Peden is a partner and member of the Financial Institutions Team at Pillsbury Winthrop Shaw Pittman LLP. She provides advice to financial institutions, bank and non-bank, and financial services companies.

Daniel Wood is a Counsel and member of the Financial Services Regulatory Team. He provides analysis for financial institutions, technology companies, and clients that offer consumer financial products.

Brian Montgomery is a Senior Counsel and member of the Financial Services Regulatory Team. He provides analysis for financial institutions, technology companies, and clients that offer consumer financial products.

The information contained in this update does not constitute legal advice and no attorney-client relationship is formed based upon the provision thereof.

 

 
