Banks and other large financial firms are increasingly embedding regulatory requirements in software testing to ensure smooth compliance with internal and external audits.
While the maze of rules and regulations is ever-growing, compliance does not have to be complicated, argues Chris Haggan, head of product DevOps at HCLSoftware.
During a 45-minute webinar on December 12, Haggan will be joined by Jonathan Harding, senior product manager for HCL DevOps Velocity, and Stewart Morin, chief operating officer at Sandhata Technologies, to discuss how firms can benchmark improvements in DevOps by collecting data from possibly dozens of different DevOps teams for value-stream mapping.
Haggan and Harding will review some key US and European regulations – notably Sarbanes-Oxley and GDPR – to highlight important requirements for data privacy, cyber security and IT change process management.
Using real-world case studies of financial firms, they will demonstrate how these requirements can be embedded in your SDLC with the HCL DevOps Velocity platform.
Finally, data management is an increasingly key challenge for financial institutions. Haggan and Harding will show how vast amounts of data can be automated for testing with HCL DevOps Test, and the webinar will feature an in-depth use case from a leading international bank.
Ahead of the online seminar, QA Financial caught up with Harding to get his take on these timely and pressing issues for many QA teams within banks and other financial services firms.
Firstly, how can banks and other large financial firms embed regulatory requirements in software testing for smooth compliance with internal and external audits?
One of the biggest risk factors encountered during system testing for financial systems is data leakage of personally identifiable information, personal health information, PCI data or proprietary information. Financial firms operating in personal banking, business banking, equity markets, fixed income, foreign exchange and other areas can have millions of customers, financial products, counterparties and other data elements that must be tested across all of a firm's interconnected systems.
The temptation to use real data, or even masked data, for testing is very strong. Violations of data privacy regulations carry huge fines that can have a massive impact on a firm's financial results. The best protection against data leakage, or worse, exfiltration by bad actors, is to use a synthetic data creation tool.
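The idea of synthetic data creation can be sketched in a few lines. This is a minimal, hypothetical illustration using only the Python standard library, not HCL DevOps Test or any specific tool; the field names and formats are assumptions for the example, and the IBAN-like strings are deliberately not checksum-valid so they can never collide with real accounts.

```python
import random
import string
import uuid

# Small illustrative name pools; a real tool would draw from far larger sets.
FIRST_NAMES = ["Alex", "Sam", "Priya", "Chen", "Maria", "Omar"]
LAST_NAMES = ["Smith", "Patel", "Garcia", "Nguyen", "Okafor", "Kim"]

def synthetic_iban(country="GB", bank="TEST"):
    """Build a syntactically plausible, deliberately invalid IBAN-like string."""
    account = "".join(random.choices(string.digits, k=14))
    return f"{country}00{bank}{account}"

def synthetic_customer():
    """Return one customer record containing no real-world data."""
    return {
        "customer_id": str(uuid.uuid4()),  # globally unique key
        "name": f"{random.choice(FIRST_NAMES)} {random.choice(LAST_NAMES)}",
        "iban": synthetic_iban(),
        # .invalid is a reserved TLD, so these addresses can never be real.
        "email": f"user-{uuid.uuid4().hex[:8]}@example.invalid",
    }

# Generate a test batch: no masking rules to maintain, nothing to leak.
batch = [synthetic_customer() for _ in range(1000)]
```

Because every record is fabricated from scratch, there is no production data to mask, subset or protect, which is the core of the argument above.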
How can firms benchmark improvements in DevOps, collecting data from possibly dozens of different DevOps product teams for value-stream mapping?
A value stream management platform such as HCL DevOps Velocity can facilitate the collection, aggregation, normalisation and analysis of team performance data relating to SDLC activities and events for dozens, hundreds or even thousands of software delivery teams, often called product teams. These teams are typically organised around a business function served by at least one IT system, but possibly dozens of independently deployable application artifacts, which are often interdependent.
“Data is a key risk factor in terms of compliance.”
– Jonathan Harding
There are a number of metrics frameworks that can be used to measure team-level performance, such as DORA metrics or the Flow Framework, to name only two. In addition, HCL DevOps Velocity can correlate delivery metrics with business impact hypotheses captured in product management tools, allowing teams to prioritise their work against the most important deliverables and to see their business impact over time.
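Two of the DORA metrics mentioned above can be computed from little more than a list of deployment events. The sketch below is an assumption-laden illustration, not the Velocity API: the record fields (`committed`, `deployed`) and the sample data are invented for the example.

```python
from datetime import datetime, timedelta

# Hypothetical deployment events: when each change was committed and when it
# reached production. A VSM platform would aggregate these across many teams.
deployments = [
    {"committed": datetime(2024, 11, 1, 9, 0), "deployed": datetime(2024, 11, 1, 17, 0)},
    {"committed": datetime(2024, 11, 3, 10, 0), "deployed": datetime(2024, 11, 4, 10, 0)},
    {"committed": datetime(2024, 11, 7, 8, 0), "deployed": datetime(2024, 11, 7, 20, 0)},
]

def deployment_frequency(events, window_days=30):
    """DORA metric: deployments per day over the observation window."""
    return len(events) / window_days

def mean_lead_time(events):
    """DORA metric: average time from commit to production deployment."""
    total = sum((e["deployed"] - e["committed"] for e in events), timedelta())
    return total / len(events)

freq = deployment_frequency(deployments)   # 3 deployments / 30 days
lead = mean_lead_time(deployments)         # (8h + 24h + 12h) / 3
```

The value of a platform is that it performs this kind of aggregation continuously, across hundreds of teams, rather than in a one-off script like this.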
You stated earlier ‘compliance does not have to be complicated’. Can you give us a real-world example please?
One of the key challenges of Sarbanes-Oxley compliance for many IT organisations is that the regulation was not written with IT specifics in mind. For example, SOX separation-of-duties requirements were written with financial bookkeeping in mind, but for IT they have been interpreted, and universally adopted, to mean that developers may not implement production changes for which they have written the code or configuration change. Traditionally, IT SOX audits must prove that an audited change was not implemented by a developer who wrote any of the code changed within that release.
What, then, of continuous delivery and fully automated build and deployment pipelines? How can one prove that a developer whose code change triggered a CI build, and was then carried automatically through all phases of testing, has not run afoul of separation of duties? One pattern that has been proven to stand up to separation-of-duties requirements is auditable peer-review compliance. In other words, a fully automated CI/CD pipeline can be proven SOX-compliant if it can be shown that all code changes have been fully peer reviewed, and that there is linkage to a documented requirements definition.
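The auditable peer-review pattern described above amounts to a mechanical check over the release's change records. The sketch below is a minimal illustration under assumed record fields (`author`, `reviewers`, `requirement_id`); it is not an HCL feature or a SOX-mandated schema, just the two rules expressed as code: every change needs a reviewer other than its author, and a link to a documented requirement.

```python
def is_release_compliant(changes):
    """Return (compliant, violations) for a list of change records."""
    violations = []
    for change in changes:
        # A self-review does not count: remove the author from the reviewer set.
        independent_reviewers = set(change.get("reviewers", [])) - {change["author"]}
        if not independent_reviewers:
            violations.append((change["id"], "no independent peer review"))
        # Linkage to a documented requirement must also be evidenced.
        if not change.get("requirement_id"):
            violations.append((change["id"], "no linked requirement"))
    return (len(violations) == 0, violations)

# Hypothetical release: CHG-102 was only "reviewed" by its own author.
release = [
    {"id": "CHG-101", "author": "dev_a", "reviewers": ["dev_b"], "requirement_id": "REQ-7"},
    {"id": "CHG-102", "author": "dev_b", "reviewers": ["dev_b"], "requirement_id": "REQ-9"},
]
ok, problems = is_release_compliant(release)
```

Run against real pipeline metadata and archived with each release, the output of a check like this is exactly the kind of evidence an auditor can replay.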
Finally, data management is a key challenge. How can vast amounts of data be automated for testing?
As mentioned previously, data is a key risk factor in terms of compliance, and most, if not all, organisations these days use some form of test data management solution as part of their DevOps processes. Rather than managing complex data masking and sub-setting rules to try to re-use production data safely, with HCL DevOps Test we have built a comprehensive data synthesis layer that enables companies to create as much lifelike data as they need for testing. This is particularly important when data volumes need to be high, such as during performance testing. Using synthetic data allows teams to maintain data integrity while ensuring that unique data is available for scenarios such as account creation, where uniqueness is key.
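The uniqueness point is worth making concrete. The sketch below assumes nothing about HCL DevOps Test's internals; it simply shows one way to guarantee collision-free account numbers at performance-test volume, by deriving each value from a monotonic counter rather than random draws (the prefix and width are invented for the example).

```python
import itertools

def account_numbers(prefix="9900", start=1):
    """Yield an endless stream of unique, fixed-width account numbers."""
    for n in itertools.count(start):
        yield f"{prefix}{n:010d}"

# Draw 100,000 values, e.g. for a load test driving account creation.
gen = account_numbers()
sample = [next(gen) for _ in range(100_000)]

# Counter-based generation makes collisions impossible by construction,
# unlike random generation, where duplicates become likely at scale.
assert len(set(sample)) == len(sample)
```

A counter-based scheme also makes test runs reproducible: restarting from the same `start` value regenerates exactly the same data set.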
More information about the upcoming webinar can be found below.

QA FINANCIAL FORUM LONDON: RECAP
In September, QA Financial held the London edition of the QA Financial Forum, a global series of conferences and networking meetings for software risk managers.
The agenda was designed to meet the needs of software testers working for banks and other financial firms operating in regulated, complex markets.
Please check our special post-conference flipbook by clicking here.
READ MORE
- QA Financial Forum Chicago 2025: what to expect this afternoon
- QA Financial Forum Chicago 2025: a sit-down with GreatAmerica’s QA lead
- Taking place TODAY: the QA Financial Forum Chicago 2025
- HDFC Bank turns to Katalon and QualityKiosk for QA upgrade
- Perforce’s Clinton Sprauve on AI testing for charts and graphics