When Simplyhealth embarked on a five-year modernisation programme, it was not just another IT refresh. For the UK’s leading member-owned healthcare plan provider, the transformation meant rebuilding claims systems, introducing deep automation and rethinking how digital clinical journeys are stitched together end-to-end.
At the heart of that work is Chief Technology Officer Tim Gough, who has overseen a shift to more than 80% claims automation while modernising both infrastructure and processes.
Ahead of his opening keynote at next month’s QA Financial Healthcare & Insurance Forum London 2025, Gough shared his lessons for QA leaders navigating the most complex, regulated and data-sensitive software environments in the industry.
Re-engineering from the ground up
Gough is unequivocal that major digital change in healthcare begins with people and process before code. “First and foremost, it’s a people transformation, as much as a technology transformation,” he said.
For Simplyhealth, the most significant progress came when the team stopped viewing the programme as a platform replacement and began treating it as a full-process rebuild.
“What you’ve got to do is look to re-engineer everything from the ground up, process-wise, because if you just replace the platform, you’re just going to get the same platform again at the end of it,” he said.
The biggest risk, he warned, is modernising the interface while leaving legacy logic intact. “Otherwise you just end up with another legacy platform that’s doing exactly the same thing as the old one did and that doesn’t win for anybody.”
Building connected healthcare
Gough’s opening session next month will focus on “connected healthcare”, a term often invoked, but rarely defined clearly. For him, the concept is simple: make the complexity disappear for the customer, while orchestrating an enormous amount of work behind the scenes.
“They don’t know [that] a hugely disconnected environment [sits] behind the scenes,” he said.
“If we can ensure that all of our customers just don’t see the amount of work we have to do to make that work… we’ve got a huge team of clinical experts and technicians who pull that patchwork of disparate clinical interventions into a really simple, single customer journey.”
That also means working with, not around, the National Health Service. “We absolutely see ourselves not as [a competitor to the NHS]. We are trying to make sure we help people find the right services at the right time, when they need them.”
Interoperability, Gough continued, is both a technical and a partnership challenge: “Allow partners who provide those clinical services to plug into us as quickly as possible, and allow us to scale and add those into the journey as easily as possible.”
“Customers can flow right through the process without feeling the pain.”
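To make the idea concrete, the sketch below shows what a partner “plug-in” contract of this kind might look like; it is a minimal illustration in Python, and names such as ClinicalPartner, Referral and route() are hypothetical stand-ins, not Simplyhealth’s actual API.

```python
# Hypothetical sketch of a partner plug-in contract: a new clinical
# provider implements one interface and the orchestrator folds it into
# a single member journey. All names here are illustrative.
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Referral:
    member_id: str
    service: str          # e.g. "physiotherapy"

class ClinicalPartner(ABC):
    """Contract a clinical provider implements to join the journey."""

    @abstractmethod
    def can_handle(self, referral: Referral) -> bool: ...

    @abstractmethod
    def book(self, referral: Referral) -> str:
        """Return a booking reference visible to the member."""

def route(referral: Referral, partners: list[ClinicalPartner]) -> str:
    # The member sees one journey; the orchestrator picks the provider.
    for partner in partners:
        if partner.can_handle(referral):
            return partner.book(referral)
    raise LookupError(f"No partner available for {referral.service}")
```

The point of a contract like this is the scaling property Gough describes: adding a new clinical service becomes implementing one interface rather than rebuilding the journey.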
New era of AI testing
Artificial intelligence, Gough believes, is moving from clinical decision support into the QA and DevOps toolchain itself. He stresses that AI is not new – “it’s been used for a long time. It’s just got more and better” – but the current wave is different.
“AI nowadays, and in recent history, is much more targeted around generative and agentic experiences in QA,” he said.
For quality engineers, this means tools that can remove repetitive work and finally deliver on the promise made years ago by robotic process automation.
“Some of the newer agentic solutions we see actually deliver some of what robotic process automation promised… and really help build that test automation that’s so critical to delivering good-quality outcomes.”
Crucially, Gough argues, these systems allow testers to move further left.
“The ability to take those tests and expand them and just make them more repeatable… allow[s] the quality engineers to define the test and help earlier in that cycle of getting the quality right at [the] front of the development cycle, not just be testers at the other end of it.”
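One way to picture that shift-left pattern is below: the quality engineer expresses expected claim outcomes as a table before the engine is built, and an agentic tool could later expand the table with edge cases. This is a minimal sketch, assuming a pytest workflow; assess_claim() is a hypothetical stand-in, not Simplyhealth’s claims engine.

```python
# Illustrative shift-left example: the expected decisions are written
# down as data first; the implementation is tested against them.
import pytest

def assess_claim(claim_type: str, amount: float, annual_limit: float) -> str:
    """Hypothetical reference stub standing in for the claims engine."""
    if amount <= 0:
        return "reject"
    if amount > annual_limit:
        return "refer"        # over limit -> route to a human assessor
    return "auto-approve"

CASES = [
    # (claim_type, amount, annual_limit, expected_decision)
    ("dental",   45.00, 200.00, "auto-approve"),
    ("optical", 250.00, 150.00, "refer"),
    ("dental",   -5.00, 200.00, "reject"),
]

@pytest.mark.parametrize("claim_type,amount,limit,expected", CASES)
def test_claim_decision(claim_type, amount, limit, expected):
    assert assess_claim(claim_type, amount, limit) == expected
```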
Future benchmark
Looking ahead, Gough sees two parallel imperatives: uncompromising data protection and resilient performance. “Customer data is so critical it can’t be lost, it can’t be leaked… that’s kind of the number one thing that you just have to keep [safe],” he said.
But cyber security is only the foundation. Availability, graceful degradation and user-invisible failure handling are becoming the real differentiators.
“It always has to come back to that customer experience of how people don’t see any impact to their usage of an application or a solution,” he said. That requires engineering quality into the early lifecycle stages. “If something fails, how well does it fail? How well does it recover?”
For QA teams, that means designing test suites that simulate not just success paths, but operational stress, integration failure and partial-system degradation, long before go-live.
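A minimal sketch of that kind of failure-path test is below, assuming a journey that should degrade gracefully when a downstream partner times out; find_appointment() and the fallback response are hypothetical, chosen only to illustrate the pattern.

```python
# Illustrative graceful-degradation test: the downstream dependency is
# deliberately failed, and the assertion is on the member-visible outcome.
def find_appointment(lookup_provider):
    try:
        return lookup_provider()
    except TimeoutError:
        # Graceful degradation: offer a callback rather than an error page.
        return {"status": "degraded", "action": "we_will_call_you_back"}

def test_partner_outage_is_invisible_to_the_member():
    def failing_partner():
        raise TimeoutError("partner API did not respond")

    result = find_appointment(failing_partner)
    # The member still gets a usable outcome, not a stack trace.
    assert result["status"] == "degraded"
    assert result["action"] == "we_will_call_you_back"
```

Tests like this answer Gough’s questions directly – “how well does it fail? How well does it recover?” – by asserting on the degraded experience, not just the happy path.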