Quantum computing is still years away from supporting mainstream banking workloads, but the implications for testing are no longer theoretical.
For QA leaders and engineering teams, 2026 is shaping up to be the year quantum readiness quietly moves from research slides to the QA backlog, driven not by hype cycles, but by security pressure, encryption lifecycles and regulatory expectations around evidence.
This shift starts from a simple premise: if quantum computers will one day be able to break today’s encryption, banks need to prove now that they can migrate when required.

A recent survey found that “three in five financial services organisations are already prototyping post-quantum cryptographic algorithms,” indicating that crypto-migration rehearsals are landing in test planning long before any quantum hardware enters production.
Haim Israel of BofA Global Research described the scale of the shift, calling quantum computing “one of the biggest revolutions yet,” enabled by “sub-atomic particles to store information” and “superpositions for complex calculations.”
Moreover, Chad Rigetti, chief executive of Rigetti Computing, framed the technology as “perhaps the most sophisticated technology that humans have ever built,” adding that “Wall Street’s antennae are twitching” as breakthroughs move closer to applied use.
Those reactions are not being met with quantum software rollouts, but with test-plan rewrites.
Vancouver-based Margarita Simonova, founder of ILoveMyQA.com, highlighted the foundational change for QA teams: “quantum computers utilise qubits, which can exist in multiple states simultaneously,” meaning quantum outputs are probabilistic rather than binary.

“Testing professionals must throw out the idea of a single ‘correct’ answer,” she said, because quantum software returns results with probabilities attached, not deterministic pass/fail outputs.
For QA, that means rethinking correctness as confidence rather than conclusion.
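Rethinking correctness as confidence can be made concrete in a test assertion. The sketch below is purely illustrative, assuming a hypothetical two-qubit circuit and made-up measurement counts; it replaces a single pass/fail check with a chi-square goodness-of-fit test against an expected probability distribution:

```python
# Sketch of a distribution-based assertion for a probabilistic system.
# The circuit, counts and tolerance are illustrative, not taken from any
# real bank's test suite: a hypothetical 2-qubit circuit is expected to
# return '00' and '11' roughly equally often, plus a small noise floor.

def chi_square_stat(observed: dict, expected: dict, shots: int) -> float:
    """Chi-square goodness-of-fit between observed counts and expected probabilities."""
    stat = 0.0
    for outcome, p in expected.items():
        exp_count = p * shots
        obs_count = observed.get(outcome, 0)
        stat += (obs_count - exp_count) ** 2 / exp_count
    return stat

SHOTS = 1000
observed = {"00": 498, "11": 492, "01": 6, "10": 4}          # simulated counts
expected = {"00": 0.49, "11": 0.49, "01": 0.01, "10": 0.01}  # model incl. noise floor

CRITICAL_VALUE_DF3 = 7.815  # chi-square critical value, 3 degrees of freedom, p = 0.05
stat = chi_square_stat(observed, expected, SHOTS)
assert stat < CRITICAL_VALUE_DF3, "distribution drifted beyond tolerance"
```

In practice a statistics library would supply the test, but the principle is the same: the assertion targets a distribution, not a value.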
Quantum’s proximity to high-performance computing is another reason it is entering QA strategies. Israel noted that “HPC systems can perform quadrillions of calculations per second,” signalling that the road to quantum flows through the same infrastructure banks already rely on for analytics, testing automation at scale and model validation. In practice, this is expanding the test surface area, not replacing it.
Early experiments illustrated why QA needs to be included now. NatWest used “quantum-inspired” computing to run a portfolio calculation “at 300 times the speed of a traditional computer,” supporting decisions on a £120bn high-quality liquid assets portfolio.
Director of Innovation Kevin Hanley was keen to stress that the bank was “really excited about the possibilities that quantum computing presents us with,” arguing that it could “completely change the way banks operate, making them much more efficient and cheaper to run.”
Efficiency gains at that scale trigger test debt: every acceleration exposes gaps in monitoring, data integrity, and reproducibility.
First quantum KPI
As banks explore quantum acceleration, security teams are outpacing them. HSBC’s global head of quantum technologies, Philip Intallura, is clear that future-proofing starts with thinking about migration today: “you have to make your new software platforms crypto-agile so you can rotate between different algorithms.”
The concept of crypto-agility is now entering QA lexicons as testing teams are asked to validate hybrid architectures and dual-encryption models.
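What crypto-agility asks of an architecture can be shown with a minimal sketch. Everything here is a stand-in: the registry names and the HMAC-based signers are placeholders where classical and post-quantum implementations would plug in; the point is the seam that lets algorithms rotate by configuration rather than code change:

```python
# Minimal crypto-agility seam: call sites select an algorithm by name,
# so rotating algorithms is a config change, not a rewrite. The HMAC
# signers below are illustrative stand-ins, not real PQC algorithms.
import hashlib
import hmac

SIGNERS = {}

def register(name):
    """Decorator that adds a signing function to the rotation registry."""
    def wrap(fn):
        SIGNERS[name] = fn
        return fn
    return wrap

@register("hmac-sha256")    # stand-in for the incumbent algorithm
def _sign_sha256(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha256).digest()

@register("hmac-sha3-256")  # stand-in for a rotated-in replacement
def _sign_sha3(key: bytes, msg: bytes) -> bytes:
    return hmac.new(key, msg, hashlib.sha3_256).digest()

def sign(msg: bytes, key: bytes, algorithm: str) -> bytes:
    """Application code never names a concrete implementation directly."""
    return SIGNERS[algorithm](key, msg)

# A rotation rehearsal can assert the same message signs cleanly under
# both the old and new algorithm:
msg, key = b"payment-batch", b"secret"
old_sig = sign(msg, key, "hmac-sha256")
new_sig = sign(msg, key, "hmac-sha3-256")
assert old_sig != new_sig and len(new_sig) == 32
```

A dual-encryption or hybrid model extends the same idea: the test suite exercises every registered algorithm, and a rotation drill is just a registry swap.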
This shift is being reinforced externally. Singapore’s central bank has launched sector-wide initiatives to “develop and roll out quantum security capabilities,” including sandbox projects to test quantum key distribution on financial networks.
These trials are designed to “inform and shape technology and cyber risk management policies towards quantum-proofing our financial systems,” indicating that quantum evidence could soon be a supervisory requirement.
In Britain, UK Finance has already warned that quantum computing could undermine existing payment-system security. Regulatory direction of travel is becoming a practical planning trigger.
Proving resilience, not just testing it
Barclays’ work with IBM on quantum algorithms for clearing demonstrates that quantum will complicate assurance before it simplifies workflows.
Executing trade-netting logic on quantum hardware introduced new constraints: only 16 qubits were available. The project team described how “better algorithms increase settlement efficiency,” but also how that complexity forced abstraction layers and hybrid designs that QA would need to validate.
“We’re looking forward to packaging more algorithms with quantum computing,” said senior architect Lee Braine, emphasising iteration rather than replacement.
Testing those hybrids requires new definitions of success. Instead of deterministic baseline comparisons, QA teams will need validation of probability distributions over single answers, error-rate monitoring as a performance variable, crypto-migration rehearsal environments, and test evidence aligned to regulatory terminology.
This is why quantum is entering the QA roadmap before quantum enters production.
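Error-rate monitoring as a performance variable, one item on that list, can also be sketched simply. The threshold and run data below are invented for illustration; the pattern is an explicit quality gate on the fraction of shots that land outside an algorithm’s valid outcome set:

```python
# Sketch: treating hardware error rate as a first-class, gated test metric.
# The 2% threshold and the simulated run data are illustrative assumptions.

def error_rate(results: list, valid_outcomes: set) -> float:
    """Fraction of shots that landed outside the algorithm's valid outcome set."""
    invalid = sum(1 for r in results if r not in valid_outcomes)
    return invalid / len(results)

def gate(results: list, valid_outcomes: set, max_error_rate: float = 0.02) -> dict:
    """Return an evidence record: the measured rate and whether it passed the gate."""
    rate = error_rate(results, valid_outcomes)
    return {"error_rate": rate, "passed": rate <= max_error_rate}

# 1,000 simulated shots: 990 valid, 10 outside the expected outcome set.
runs = ["00"] * 492 + ["11"] * 498 + ["01"] * 6 + ["10"] * 4
report = gate(runs, {"00", "11"})
# report["error_rate"] == 0.01 and report["passed"] is True
```

The dictionary the gate returns doubles as test evidence, which matters if supervisors start asking for quantum-readiness proof in audit-friendly form.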
Toward a non-deterministic future
The industry’s tone has shifted. As one QA Financial analysis recently put it: “even as AI dominates investment cycles, quantum computing is reshaping long-term testing priorities,” with post-quantum cryptography emerging as a resilience metric.
The timeline is uneven: many analysts expect the most useful quantum banking applications to land around 2030–2035. The backlog, however, is filling now.
For QA leaders, the message is blunt: AI changed how software is built; quantum is changing what banks must be able to prove about it.
In 2026 that proof is likely to take the form of crypto-agile architectures, quantum-aware test strategies, encryption-migration rehearsals and resilience drills designed for a probabilistic era.
Quantum computing may not be ready for banks. But banks, and their regulators, are getting ready for quantum. And QA is where that preparation becomes real.