As enterprises launch new software and digital transformation initiatives to gain a competitive edge, practically every business is seeing massive technology disruption.
Often, these changes introduce new operational and economic challenges, and nowhere more so than in the highly regulated financial sector.
Now, a growing number of financial institutions are taking a fresh look at their lab and testing infrastructures to help address these issues.
Recently, QA Financial spoke with Anil Kollipara, vice president of product management at Spirent Communications, about what’s driving these challenges, and how lab and testing automation can help enterprises overcome them.
QA Financial: What pressures are financial institutions facing right now that are driving a need to rethink lab and testing approaches?
Anil Kollipara: As financial enterprises—really, all enterprises—expand their digital transformation initiatives and add new cloud and AI applications, they’re achieving impressive gains. At the same time, though, they’re dramatically increasing the complexity of the enterprise IT environment.
Most of these new tools use software-centric approaches (software-defined networking, virtualization, cloudification) that automatically scale infrastructure and resources. These approaches make enterprise environments far more agile and adaptable, but they also introduce many more network elements and vendors, and many more moving parts, all of which now get continuously updated. The result is a network up to 150x more complex than legacy environments.
Suddenly, there are many more places where something could go wrong—and more opportunities to fall out of compliance, disrupt customers, or suffer outages, with all the costs and penalties that entails. Something as simple as a routine security update can cause severe disruptions, as the CrowdStrike incident showed: it led to massive outages, costs, and penalties, and negatively affected millions of end customers.
“Something as simple as a routine security update could cause severe disruptions as in the CrowdStrike incident.”
– Anil Kollipara
The only way to deal with this complexity is to test more things, more frequently. But too many businesses try to do this the old-fashioned way, throwing bodies at the problem. Unfortunately, that approach won’t scale, both from a cost perspective and just in terms of covering all the test cases that enterprises now need to address.
QA Financial: What problems do enterprises encounter when trying to support modern infrastructures with traditional labs and testing?
Anil Kollipara: There are several. First, traditional labs were largely designed to support single-vendor physical networks. As a result, organizations often maintain many redundant lab instances. This creates specialized testing silos, each dedicated to a specific location, test team, or both, increasing CapEx. And since most lab equipment stays powered on at all times, even when not in use, it increases OpEx and carbon footprint too.
Additionally, these silos can impede interoperability and integration testing across labs and teams. Teams may duplicate efforts or fail to synchronize or communicate effectively, reducing productivity. Worse, this fractured testing too often fails to identify issues until after they’ve propagated into deployment in the live network, increasing risk of outages and other customer-facing problems.
Finally, traditional testing models typically have very long timelines. Setting up a new lab is a highly manual process that can take weeks, and building and executing testing campaigns can take months. That’s far too slow for modern software-driven application environments that undergo constant change.
QA Financial: Why should financial institutions in particular be concerned about these issues?
Anil Kollipara: For two reasons. First, financial enterprises face fierce competitive pressure to bring new capabilities to customers while maintaining high-performing, highly reliable infrastructures to serve them. As a result, they tend to be among the first to adopt new digital tools, and the first to grapple with the challenges those innovations introduce.
Second, enterprises in the tightly regulated financial sector face significant downsides if they don’t modernize. To meet compliance requirements, most financial institutions must certify their infrastructures multiple times per year. With so many devices, applications, and distributed physical environments to test, traditional manual approaches simply can’t keep up.
Additionally, when financial institutions suffer a service outage due to a lack of comprehensive testing, they don’t just take a hit to customer satisfaction the way other businesses would. They may also be subject to stiff financial penalties, mandated remediation steps, and increased scrutiny from regulators.
“Fractured testing too often fails to identify issues until after they’ve propagated into deployment.”
– Anil Kollipara
QA Financial: How can institutions address these issues?
Anil Kollipara: In the end, the only way to get a handle on these challenges is by transforming your approach to labs and building the framework for continuous automated testing. That starts with connecting up all labs and networks to enable comprehensive virtual access to all testing resources from one place. Then, you layer both lab and testing automation on top of that framework to automate the whole process.
Now, whenever a change happens in any part of the network (a software update, a security patch, and so on), it automatically invokes a new test cycle before going live. This ensures that even as more changes happen in the network, each one is comprehensively tested, verified, and documented for regulatory auditing. You’ve reduced your risk of outages and compliance issues. Most important, you’re able to make the most of new cloud and AI investments to achieve business goals, while maintaining the uptime and performance your customers expect.
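To make the workflow concrete, the sketch below shows one way such a change-gate could look in Python. It is purely illustrative and assumes a hypothetical change-event feed and test-orchestration layer; names such as ChangeEvent, LabOrchestrator, and gate_change are illustrative, not Spirent product APIs. The flow mirrors what Kollipara describes: every planned change reserves a matching testbed, runs the test campaign, archives the evidence for auditors, and only then is cleared to go live.

```python
# Minimal sketch of change-triggered test automation. All names here are
# hypothetical stand-ins, not vendor APIs.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class ChangeEvent:
    """A planned change anywhere in the network (patch, upgrade, config push)."""
    change_id: str
    target: str        # device, application, or site affected
    change_type: str   # e.g. "security_patch", "software_update"


class LabOrchestrator:
    """Stand-in for a lab/test automation layer with virtual access to all testbeds."""

    def reserve_testbed(self, target: str) -> str:
        # Reserve (or spin up) a testbed that mirrors the affected environment.
        return f"testbed-for-{target}"

    def run_campaign(self, testbed: str, change: ChangeEvent) -> dict:
        # Execute the regression/compliance test suite against the staged change.
        return {"passed": True, "report": f"{change.change_id}-results.json"}

    def archive_for_audit(self, change: ChangeEvent, result: dict) -> None:
        # Store the evidence needed for regulatory certification and audits.
        print(f"[{datetime.now(timezone.utc).isoformat()}] "
              f"archived {result['report']} for change {change.change_id}")


def gate_change(change: ChangeEvent, lab: LabOrchestrator) -> bool:
    """Invoke a full test cycle before the change is allowed to go live."""
    testbed = lab.reserve_testbed(change.target)
    result = lab.run_campaign(testbed, change)
    lab.archive_for_audit(change, result)
    return result["passed"]  # only promote the change if every check passed


if __name__ == "__main__":
    lab = LabOrchestrator()
    patch = ChangeEvent("CHG-1042", "core-payment-gateway", "security_patch")
    print("go-live approved" if gate_change(patch, lab) else "go-live blocked")
```

In a real deployment, a hook like gate_change would sit inside the change-management or CI/CD pipeline, so that no update reaches production without an automatically generated, audit-ready test record.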
QA Financial: Does this kind of automation have cost implications as well?
Anil Kollipara: Absolutely. First, it reduces redundancy. By consolidating labs, there’s less expensive equipment to purchase, reducing CapEx, and less underutilized equipment consuming power and space, reducing OpEx and waste. You increase lab capacity and testing group productivity, even as you lower costs. You also accelerate testing timelines, from the 4-5 months it can take to set up and run a traditional testing environment to just a few hours. That’s a huge OpEx savings, in addition to improving time-to-revenue for new products.
In the lab transformation efforts we’re engaged in now (including both lab and test automation), we’re seeing enterprises cut their required testing equipment in half while more than doubling testbed resource utilization. They’re setting up testbeds 300x more quickly and achieving 90% testing productivity gains. They’re reducing energy consumption across lab environments by more than 40%—saving more than $1 million annually for a typical 250-rack lab, while eliminating nearly 7,500 metric tons of CO2 emissions. And they’re cutting overall lab CapEx and OpEx by 60%. It’s not unusual for enterprises investing in lab and testing automation to realize a full return on investment (ROI) within 18 months, with tens of millions in net savings over three years.
Arguably the biggest benefit, though, is that enterprises making these changes gain an agile, real-time testing capability that can keep up with today’s disaggregated, software-driven, multi-vendor networks. These institutions can continue to invest in digital transformation initiatives to gain a competitive edge, knowing that their testing coverage will continually and automatically scale with the business.
Want to learn more about the journey and benefits of global lab automation & management? Download the eBook, A Model for Enterprise Lab Transformation.