The Bank of England and FCA are encouraged to implement AI stress tests.

Britain’s parliamentary Treasury Committee has urged a more proactive approach to overseeing artificial intelligence (AI) within financial services.

Committee members cautioned that the current regulatory practices may not be sufficient as AI systems become increasingly integral to the sector. In their recent report, they suggested that the Financial Conduct Authority and the Bank of England develop specific stress-testing frameworks for AI-driven systems. These tests are intended to help banks, insurers, and other financial firms understand how automated decision-making tools might react during periods of market disruption or technological failure.

The report further recommended that the Financial Conduct Authority provide clearer guidance on how existing consumer protection rules apply when AI is used in areas such as credit assessments, insurance pricing, and customer interactions. The committee also called for guidelines outlining the expected level of understanding senior managers should have regarding AI systems under their purview, with a suggestion that these guidelines be published by the end of 2026.

AI Adoption Poses Consumer and Stability Risks

Evidence presented to the committee revealed that three-quarters of British financial services firms now use AI in core operations, including claims handling and lending decisions. Regulators recognized that more advanced forms of AI, especially those capable of acting autonomously, present additional risks for retail customers.

The report highlighted concerns regarding opaque decision-making processes, potential discrimination against vulnerable consumers, increased fraud risks, and the spread of unregulated financial advice through AI-powered tools. Witnesses also raised broader financial stability issues, noting that a reliance on a few US-based cloud and AI providers could create concentration risks. In trading contexts, automated systems were said to potentially amplify herd behavior during market stress.

Treasury Committee representatives indicated that, based on the evidence reviewed, the financial system may not currently be adequately prepared for an AI-related incident that could have far-reaching consequences for consumers.

In response, officials from the Financial Conduct Authority stated that they would review the findings. They reiterated their existing stance that rigid AI-specific rules might struggle to keep up with technological change. Bank of England representatives acknowledged ongoing work to assess AI-related risks and promised a formal response to the recommendations.