AI Regulation in Finance: 6 Moves CISOs and CTOs Should Make Right Now to Prepare for the CFPB

January 2, 2024
Authored by
Airlie Hilliard
Senior Researcher at Holistic AI

Throughout 2023, the Consumer Financial Protection Bureau (CFPB) dropped hints that it is gearing up for a more central role in AI regulation. The CFPB itself is nothing new: if you work in legal, compliance, or information security in finance or banking, you've likely dealt with its requirements many times.

In the past, the CFPB has been less active on AI than other regulators. Nevertheless, at several key moments across 2023, the agency signaled that it could now ramp up its efforts to address AI and protect consumers from harm.

In this guide, we’ll look at the key actions the CFPB took on AI in 2023, as well as patterns in past CFPB regulation that can help AI GRC leaders start preparing now. This will culminate in six moves CISOs and CTOs should make today to de-risk CFPB enforcement and improve AI efficacy.

What CFPB's past tells us about future AI moves

The CFPB has a wide-ranging mission to “protect consumers from unfair, deceptive, or abusive practices and take action against companies that break the law.” This leaves the agency with a great deal of discretion over what it focuses on.

While the agency signaled an intent to focus on a few key areas of AI in 2023, we can look to its past priorities to discern the potential full range of its regulatory focus.

Prominent focus areas of the CFPB:

  1. Unfair, Deceptive, or Abusive Acts and Practices (UDAAP): This broad category encompasses any financial practice that harms consumers or takes advantage of them. The CFPB has consistently cracked down on UDAAP in areas like predatory lending, debt collection harassment, hidden fees, and deceptive marketing.
  2. Mortgage Lending: The CFPB played a crucial role in reforming the mortgage market after the 2008 crisis. They continue to regulate unfair mortgage practices, protect borrowers from discrimination, and ensure transparency in mortgage lending processes.
  3. Student Loan Servicing: The CFPB has been active in investigating and addressing complaints about student loan servicers who engage in misleading practices, unfair repayment options, and inadequate customer service.
  4. Consumer Credit Reporting: The CFPB oversees credit reporting agencies to ensure accurate and fair credit reporting practices. They have implemented regulations to improve the dispute resolution process for consumers and crack down on inaccurate credit reports.
  5. Emerging Technologies: As new financial technologies and products emerge, the CFPB stays vigilant in identifying potential risks and crafting regulations to protect consumers. Recent areas of focus include fintech lending, cryptocurrency platforms, and robo-advisors.

As we’ll see in the first announcement below, the CFPB is aligned with other US agencies in holding that existing laws apply to AI. It’s therefore a solid bet that the CFPB will focus its initial regulation of AI on its existing focus areas, where it already has teams and enforcement capabilities in place.

Additionally, the CFPB routinely works with other agencies and governmental bodies. Below are some areas at the intersection between what other agencies are focusing on and past CFPB focus:

  • Automated debt collection tools have come under scrutiny around the world. Most notably, the Australian government’s automated “Robodebt” debt collection scheme was found to be unlawful, leaving the government to repay and settle claims worth roughly $1.7B.
  • Robo-advisors and chatbots have been a focus of the CFPB since before the current generation of generative AI. The lack of guardrails and the potential for personally identifiable information (PII) to pass through these interfaces make generative AI a likely focus.
  • Frameworks and legislation to prevent bias in insurance quote adjustment and hiring could be extended to automated systems handling credit applications.

All we know for certain are the CFPB’s existing statements, covered below. But there’s reason to believe these focus areas could become highlights of the agency’s regulatory agenda, and much to be gained by not being caught unprepared.

Announcement: CFPB joint statement on AI and existing laws

In April 2023, the Consumer Financial Protection Bureau (CFPB) - in collaboration with the Equal Employment Opportunity Commission (EEOC), the Department of Justice’s Civil Rights Division (DOJ), and the Federal Trade Commission (FTC) - issued a joint statement on AI and automated systems.

While the joint statement did not announce any AI-specific regulatory efforts, it did reiterate that the agencies’ existing enforcement powers also apply to AI and automated decision-making, with several agencies having already taken enforcement action against AI systems under that authority.

In particular, the joint statement highlighted that multiple components of these tools – including data and datasets, model transparency, and deployment – can all lead to violations of federal laws if they are not considered or accounted for throughout the entire lifecycle of AI systems.

Rule Proposal: Proposed rule for AI in home appraisals

On 1 June 2023, the CFPB proposed a new rule, titled Quality Standards for Automated Valuation Models.

Proposed in conjunction with the Board of Governors of the Federal Reserve System, the Federal Deposit Insurance Corporation, the Federal Housing Finance Agency, the National Credit Union Administration, and the Office of the Comptroller of the Currency, the rule aims to promote fairness and accuracy in home appraisals made using AI.

Under the rule’s terms, automated valuation models (AVMs) are computerized models used by mortgage originators and secondary-market issuers to determine the collateral worth of a mortgage on a property. The rule effectively implements the quality control standards mandated by the Dodd-Frank Wall Street Reform and Consumer Protection Act (Dodd-Frank Act) and applies them to AI.

Under the proposed rule, institutions would have flexibility to set quality controls for AVMs based on their size and risk profile, as well as the complexity of the transactions for which the AVM will be used. Comments on the proposed new section 1125 were open until 31 August 2023, after the rule was published in the Federal Register at the end of June 2023.

Ongoing research: CFPB spotlight on conversational AI in banking

On 6 June 2023, the CFPB issued a spotlight on the use of AI-driven chatbots in banking. While the spotlight did not advance any regulatory activities either, it did highlight a number of concerns surrounding the use of AI in banking. In particular, concerns were raised about the efficacy of conversational AI at answering customer questions, with chatbots performing acceptably on simple tasks but failing to provide satisfactory responses to more complex ones.

Moreover, there are concerns about privacy and security risks resulting from the use of AI chatbots in banking due to the sensitive nature of the data being handled. As such, the spotlight reminds banks that they are liable for the inputs and outputs of any chatbots they use, and that they are responsible for compliance with relevant laws.
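
Given that liability, a common engineering response is to put a guardrail layer in front of the chatbot that redacts apparent PII and keeps an auditable record of every exchange. The sketch below is a minimal illustration of that pattern, not a production control: the regex patterns, function names, and the `llm_call` hook are assumptions standing in for whatever model client and PII-detection service a bank actually uses.

```python
import json
import logging
import re
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("chatbot_audit")

# Illustrative patterns only; real deployments typically rely on a
# dedicated PII-detection service rather than hand-rolled regexes.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def redact(text: str) -> str:
    """Replace anything that looks like PII with a typed placeholder."""
    for label, pattern in PII_PATTERNS.items():
        text = pattern.sub(f"[REDACTED_{label.upper()}]", text)
    return text

def guarded_chat(user_message: str, llm_call) -> str:
    """Redact PII on the way in and out, and log both sides for compliance review.

    `llm_call` is a stand-in for whichever model client the bank uses.
    """
    safe_input = redact(user_message)
    safe_output = redact(llm_call(safe_input))
    audit_log.info(json.dumps({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "input": safe_input,
        "output": safe_output,
    }))
    return safe_output
```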

New requirements: CFPB Circular 2023-03 on adverse action notices

More recently, on 19 September 2023, the CFPB published Circular 2023-03, which clarified requirements for adverse action notices provided by creditors, whereby creditors taking adverse action against an applicant must supply a statement with the specific reasons for taking that action.

This requirement stems from the Equal Credit Opportunity Act (ECOA), as implemented by Regulation B. Under these provisions, it is unlawful for any creditor to discriminate against any applicant with respect to any aspect of a credit transaction on the basis of protected attributes.

The CFPB provides sample forms for issuing adverse action notices (adverse action being a decision by a creditor that negatively affects a consumer’s request for credit or an existing credit arrangement), but the circular was published to remind creditors that the reasons provided in the statement must be specific and accurate. As such, creditors cannot rely on an unmodified checklist if the reasons listed on the sample notice are not relevant to the specific adverse action being taken. This is particularly important when using AI and automated decision-making, since the templates may not contain applicable reasons for the action.

Circular 2023-03 follows on from Circular 2022-03, which also addressed how to provide adverse action notices when using complex algorithms to make credit decisions. That circular confirmed that adverse action notices must be provided even when complex algorithms are used, and asserted that such algorithms should not be used if they would prevent creditors from providing accurate notices when adverse action is taken against an applicant.
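
The practical challenge these circulars create for automated underwriting is translating a model’s output into specific, accurate reasons rather than boilerplate. The sketch below illustrates one way teams approach this, assuming the model already produces per-feature contribution scores (for example, from a scorecard or a SHAP-style explainer); the feature names, sign convention, and reason wording are all illustrative assumptions, not a compliant notice template.

```python
# Minimal sketch: derive candidate adverse action reasons from per-feature
# contribution scores produced by a credit model. Feature names, scores,
# and reason wording are illustrative only.

REASON_TEXT = {
    "credit_utilization": "Proportion of revolving credit in use is too high",
    "delinquency_count": "Number of recent delinquent accounts",
    "credit_history_length": "Length of credit history is insufficient",
    "debt_to_income": "Debt-to-income ratio is too high",
}

def adverse_action_reasons(contributions: dict[str, float], top_n: int = 4) -> list[str]:
    """Return reasons for the features that pushed the decision hardest toward denial.

    `contributions` maps feature name -> signed contribution, where negative
    values count against the applicant (that convention is an assumption).
    """
    adverse = sorted(
        (name for name, score in contributions.items() if score < 0),
        key=lambda name: contributions[name],
    )
    return [REASON_TEXT.get(name, name) for name in adverse[:top_n]]

# Example: contributions for a single denied applicant
example = {
    "credit_utilization": -0.42,
    "debt_to_income": -0.18,
    "credit_history_length": 0.05,
    "delinquency_count": -0.31,
}
print(adverse_action_reasons(example))
# -> reasons for credit_utilization, delinquency_count, debt_to_income
```

Any automatically generated reasons would still need human review to confirm they are specific and accurate enough to satisfy Regulation B.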

Checklist for staying ahead of CFPB regulations

✅ You can’t respond to regulation if you don’t know where your AI “lives”. Start with robust AI inventorying and registration of new deployments

✅ Create a notification for CFPB in your AI tracker account

✅ Don’t “tunnel vision” on the CFPB; keep a holistic view of finance regulation as a whole (note the EU AI Act and the California-, Colorado-, and NYC-specific laws)

✅ Establish routine auditing for bias, privacy, efficacy, robustness, and explainability (our platform lets you do this on autopilot; see the bias-check sketch after this checklist)

✅ Build team-wide awareness of past and present fintech regulation (news monitoring and upskilling are key)

✅ Build guardrails and governance into your LLM applications
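
On the auditing point, one widely used first check for bias in credit decisions is the adverse impact ratio: each group’s approval rate divided by the highest group’s rate. The sketch below is a minimal, assumption-laden illustration (toy data, a single binary outcome, and the conventional “four-fifths” review threshold), not a full fairness audit.

```python
import pandas as pd

def adverse_impact_ratios(df: pd.DataFrame, group_col: str, outcome_col: str) -> pd.Series:
    """Selection rate of each group divided by the highest group's selection rate.

    Values below roughly 0.8 are commonly flagged for review under the
    "four-fifths rule"; that threshold is a convention, not a legal test.
    """
    rates = df.groupby(group_col)[outcome_col].mean()
    return rates / rates.max()

# Illustrative data: 1 = approved, 0 = denied
decisions = pd.DataFrame({
    "group": ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1, 1, 1, 0, 1, 0, 0, 0],
})
print(adverse_impact_ratios(decisions, "group", "approved"))
# group A -> 1.00, group B -> 0.33 (would be flagged for review)
```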

Time to get compliant

Is your organization developing or deploying AI in the financial services sector?

Regulators and lawmakers are ramping up their efforts to regulate AI, both in your industry and beyond.

Make sure you are equipped to navigate existing and emerging legislation with Holistic AI.

Our Governance, Risk, and Compliance Platform and suite of innovative solutions are trusted by global companies.

Schedule a call with one of our AI policy experts to enable your financial AI systems with heightened efficacy, adoption, and trust.

DISCLAIMER: This blog article is for informational purposes only. This blog article is not intended to, and does not, provide legal advice or a legal opinion. It is not a do-it-yourself guide to resolving legal issues or handling litigation. This blog article is not a substitute for experienced legal counsel and does not provide legal advice regarding any situation or employer.
