Standard Chartered legal head flags urgent push for AI stress testing


A legal head at one of the world’s biggest banks has highlighted growing pressure on UK regulators and financial services firms to treat AI testing and stress testing as an urgent resilience priority, following a recent warning from MPs that the sector is not prepared for an “AI-driven market shock”.

John Ho, Head of Legal, Financial Markets at Standard Chartered Bank, pointed this week to the findings of the House of Commons Treasury Committee, which concluded that artificial intelligence is already deeply embedded across banking and insurance, but that governance and supervisory frameworks are lagging behind adoption.


“According to evidence received by the Committee, more than 75% of UK financial services firms are now using AI, with the largest take-up among insurers and international banks,” Ho wrote on LinkedIn.

He stressed that AI is being deployed “to automate administrative functions and to deliver core services such as processing insurance claims and credit assessments.”

Ho’s comments come weeks after the UK Treasury Committee published its report urging the Financial Conduct Authority (FCA), the Bank of England and HM Treasury, the UK’s finance ministry, to move faster in developing AI-specific safeguards, warning that existing supervisory approaches are not yet equipped to capture the systemic risks AI could introduce.

Committee chair Dame Meg Hillier said the current approach “is exposing consumers and the financial system to potentially serious harm” as AI adoption accelerates across financial services.

From theory to regulatory expectation

For QA and software testing teams inside banks, the renewed attention from senior industry figures underscores that AI is no longer being treated as a marginal innovation issue.

British MPs argued that without explicit AI stress testing, firms and regulators remain ill-prepared to understand how AI systems could amplify operational failures, cyber-incidents and market volatility.


“The Treasury Committee believes that action is needed to ensure that this is done safely,” Ho wrote, highlighting one of the report’s most significant recommendations: “for the Bank of England and the FCA to conduct AI-specific stress-testing to boost businesses’ readiness for any future AI-driven market shock.”

The Committee warned that while regulators already run extensive cyber and operational resilience exercises, these are not yet designed to test AI-driven failure modes.

“Crucially, however, the Bank of England and the FCA do not conduct AI-specific cyber or market stress testing,” the report stated.

To address that gap, MPs set out what they described as a clear expectation for the supervisory agenda: “to build firms’ readiness for AI-driven market shocks, the Bank of England and the Financial Conduct Authority must conduct AI-specific stress testing.”

For banks, that recommendation has direct implications for internal QA strategy. AI-specific stress testing would require firms to simulate failures in machine-learning models, data pipelines and automated decision controls, while demonstrating that systems remain explainable, controllable and recoverable under severe stress.
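As an illustrative sketch only, the kind of failure-injection check such a stress-testing programme implies might look like the following. Every name here (CreditModel, FallbackPolicy, decide, the failure modes and thresholds) is invented for illustration; the report does not prescribe any implementation.

```python
# Toy stress-test harness: inject AI failure modes into a hypothetical
# credit-decision pipeline and check that it degrades safely.

class FallbackPolicy:
    """Conservative rule-based fallback used when the model is unavailable."""
    def score(self, applicant):
        # Under stress, decline anything without a verified income field.
        return 0.0 if applicant.get("income") is None else 0.5

class CreditModel:
    """Stand-in for a deployed ML model; can be forced into failure modes."""
    def __init__(self):
        self.failure_mode = None  # None | "timeout" | "drifted"

    def score(self, applicant):
        if self.failure_mode == "timeout":
            raise TimeoutError("model endpoint unavailable")
        if self.failure_mode == "drifted":
            return 1.0  # pathological: approves everyone
        return min(applicant.get("income", 0) / 100_000, 1.0)

def decide(model, fallback, applicant, drift_alarm=0.99):
    """Route around model failures; flag suspicious outputs for review."""
    try:
        score = model.score(applicant)
    except TimeoutError:
        return ("fallback", fallback.score(applicant))
    if score >= drift_alarm:  # crude out-of-distribution guardrail
        return ("manual_review", score)
    return ("model", score)

# Stress scenario 1: model endpoint down -> fallback must engage.
model, fallback = CreditModel(), FallbackPolicy()
model.failure_mode = "timeout"
route, _ = decide(model, fallback, {"income": 40_000})
assert route == "fallback"

# Stress scenario 2: drifted model approving everything -> escalate.
model.failure_mode = "drifted"
route, _ = decide(model, fallback, {"income": 40_000})
assert route == "manual_review"
```

The point of the sketch is the shape of the exercise, not the code: each failure mode is declared up front, and the test passes only if the surrounding controls keep the decision explainable and recoverable.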

Ho also stressed that lawmakers recognised AI’s potential upside, but warned that benefits depend on stronger oversight.


“The report noted that AI and wider technological developments could bring considerable benefits to consumers,” Ho wrote, adding that “the Committee, therefore, encourages firms and the Financial Conduct Authority (FCA) to work together to ensure that the UK capitalises on AI’s opportunities.”

However, MPs argued that AI introduces new systemic risks precisely because similar automated models may be deployed across multiple firms, reinforcing each other during periods of disruption. Automated decision-making, algorithmic trading and AI-driven analytics could accelerate instability when markets are already under strain.

Jonathan Hall, an external member of the Bank of England’s Financial Policy Committee, told the inquiry that AI-specific stress testing could be “extremely valuable”.

He suggested AI-driven scenarios should be incorporated into future system-wide market stress tests.

Third-party AI oversight

Cyber resilience was another core theme of the report. Regulators already conduct Cyber and Operational Resilience Stress Tests, but MPs warned that AI changes the cyber-threat landscape by expanding attack surfaces and introducing new failure modes.

“AI heightens cyber-security vulnerabilities, increasing the volume and scale of cyber-attacks against the financial services sector,” the report stated.

For QA, security testing and resilience leaders, this points to the need to integrate AI assurance directly into cyber-stress testing programmes, validating how models behave during live attacks, how quickly failures are detected, and whether automated responses introduce additional risk.
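One concrete measure such a programme could track is detection latency: how many tampered inputs reach the model before monitoring fires. The following is a minimal sketch under assumed names and thresholds, not a real monitoring implementation.

```python
# Toy measurement of how quickly a hypothetical drift check detects
# that model inputs have been tampered with during a simulated attack.

def detect_tamper(window, expected_mean=0.5, tolerance=0.2):
    """Flag a window of model input values whose mean has shifted."""
    mean = sum(window) / len(window)
    return abs(mean - expected_mean) > tolerance

def detection_latency(stream, attack_start, window_size=5):
    """Return how many post-attack inputs pass before detection fires."""
    for i in range(attack_start, len(stream) - window_size + 1):
        if detect_tamper(stream[i:i + window_size]):
            return i - attack_start
    return None  # attack never detected within the stream

# Normal traffic, then an attacker injects inflated feature values.
normal = [0.5] * 20
attacked = normal + [0.9] * 10   # attack begins at index 20
latency = detection_latency(attacked, attack_start=20)
assert latency is not None and latency <= 5
```

In a real exercise the same idea would apply to model-level signals (score distributions, approval rates) as well as raw inputs, with latency budgets set by the firm's resilience targets.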


“There should be a clearer explanation of who in finance organisations should be accountable for harm caused through AI.”

– John Ho

Beyond stress testing, Ho highlighted the Committee’s call for clearer accountability frameworks around AI deployment.

“The Treasury Committee is also recommending that the FCA should publish practical guidance on AI for firms by the end of this year,” he wrote, adding that it should include “how consumer protection rules apply to their use of AI” and set out “a clearer explanation of who in those organisations should be accountable for harm caused through AI.”

For software testing and governance teams, that implies tougher expectations around documentation, auditability, ownership and control validation as AI becomes embedded in core banking activity.

The Committee also focused on the growing dependence on third-party technology providers underpinning AI deployments.

“The Critical Third Parties Regime was established to give the FCA and the Bank of England new powers of investigation and enforcement over non-financial firms which provide critical services to the UK financial services sector, including AI and cloud providers,” Ho noted.

“The Committee urges the Government to designate AI and cloud providers deemed critical to the financial services sector in order to improve oversight and resilience.”

For financial services QA teams, that signals that resilience testing requirements may increasingly extend beyond internal systems, forcing banks to validate not only their own AI models but also the robustness of external platforms and providers on which those models rely.

