
Whitepaper

How an imbalanced test automation strategy hurts business agility

IT leaders and DevOps practitioners: train teams for balanced automation practices throughout software delivery to maximize agility, minimize risk and reduce cost.

Jason English, Principal Analyst, Intellyx

October 2020


Whitepaper: Imbalanced Test Automation Hurts Agility

©2020 Intellyx LLC. https://intellyx.com

Anyone involved in software delivery will tell you that automation, wherever possible, is an inherently good thing. But can a devotion to test automation for its own sake somehow throw our software lifecycle out of balance?

We’re all becoming acutely aware lately of the need for balance in assuring our own health, and the health of those we care about. We consume content from epidemiologists, nutritionists and personal trainers in search of this balance.

How our bodies perform has a lot to do with maintaining balance over time. Controlling caloric intake is important for a diet, but you also need the right amounts of whole-food fats and cholesterol for long-term success. Exercise is beneficial when training for a sport, but taking it to extremes without enough rest can weaken the body’s ability to recover and cause injuries in the field. You may need vitamin D and potassium to head off an infection, but too much of either compound can be dangerous.

That’s not even addressing our state of mind, which can either stabilize us or, if suddenly subjected to undue mental stress, produce the most destructive effects on our health.

Automation in the software delivery lifecycle, especially test automation, demands the same kind of balanced strategy as the proactive practices we might adopt to improve our own personal health.

As software inevitably becomes more complex and distributed over time, potential failures appear everywhere. Therefore, we will always need ever-increasing test automation for static analysis, regressions, performance and functional validation.


Automation of builds, tests, deployments and observability is beneficial for the health of our software, but only in moderation. We can replace most functional UI tests with automation, for instance, but the need for some level of manual UAT and human verification will always remain.

If too much test automation is applied at the least opportune times, for the wrong reasons, the software lifecycle, and with it your business agility, will be pushed out of balance. This paper will explore how test engineering and development practices can achieve a healthy balance of test automation by automating the right kinds of tests at the right time, at the right place, and with the right resources for success.

Challenges: How test automation can become counterproductive

“Approximately 67% of the test cases being built, maintained, and executed are redundant and add no value to the testing effort.”

-- Tricentis research conducted from 2015-2018 at Global 2000 companies, primarily across finance, insurance, telecom, retail, and energy sectors.

Testing is the yang to the yin of software development -- without it, there’s no possible way to know that software will meet requirements.

As agile software development and delivery accelerate, it seems obvious that we can, and should, conduct continuous automated testing as much as possible. But we need to do it strategically, not blindly. Otherwise, bad habits could cause test automation to become counterproductive and actually undermine our business agility:

§ Tipping the scales away from customer outcomes. Test automation goals should always be aligned with customer goals: better software functionality, reliability, security and performance. It is easy to forget to tie business incentives to every instance of test automation that is introduced and maintained, but without that alignment, teams create fruitless labor and costs merely for the sake of checking a box.

§ Feeling a false sense of security. Symptoms of this habit may include claims of 99% or higher ‘test coverage’ created by executing hundreds of thousands of static code checks, unit tests, data comparisons and regressions. High-volume test procedures are useful for quality control gateways at each phase of the software delivery lifecycle. But saying ‘our million unit tests passed’ at any one level doesn’t automatically translate to a better user experience -- and such statistics can’t provide more than a fig leaf’s worth of coverage across a complex application portfolio.

§ Inflexibility to change. If the test strategy isn’t architected for change, then every new update, component, or contribution makes test automation unusable, test data invalid, and results hard to reproduce. Brittle tests -- those that can’t survive changes, especially in agile environments -- produce 60 to 80 percent of the false positives and negatives seen by testers. When assets are not responsive to change, teams give up on the wasted effort of repairing existing tests and building new ones, impairing the organization’s ability to move forward.

§ Test bloat and burn. The reflexive response to imbalanced test automation is creating more and more of the easy tests, or slight variations of existing ones. Since failures of redundant tests are hard to trace back to their source, nearly as much time is spent re-creating tests as writing new ones, on the assumption that failures happen because the tests are bad, not the application itself.

Test bloat results in higher system costs and cloud opex for running test workloads and gathering and cleansing test data, which in turn drives a higher labor burn rate. If integration partners are involved and incentivized to create more tests, they may burn budget at an alarming rate, while internal testers experience higher burnout.

Together, these challenges consume costly resources that would be better invested in the highest-value work -- the work that can’t be automated. This erodes the organization’s confidence in testing over time, severely impacting your ability to rapidly release software that meets customers’ ever-changing needs.

Fortunately, there are ways to break this vicious cycle by applying balanced test automation at the right time, at the right place, and with the right resources.
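The brittle-test problem described above can be made concrete with a small sketch. This is illustrative Python, not any particular framework’s API; `checkout_summary()` is a hypothetical stand-in for text scraped from a rendered UI:

```python
import re

def checkout_summary() -> str:
    # Stand-in for text read from the rendered UI; the copy and
    # formatting routinely change between releases.
    return "Order total: $42.00 (incl. tax)"

def brittle_test() -> bool:
    # Breaks whenever wording, layout or punctuation changes,
    # raising a false alarm even though the price is correct.
    return checkout_summary() == "Total: $42.00"

def resilient_test() -> bool:
    # Validates the business outcome -- the amount charged --
    # and survives cosmetic changes to the screen text.
    match = re.search(r"\$(\d+\.\d{2})", checkout_summary())
    return match is not None and float(match.group(1)) == 42.00
```

Here the brittle check fails on a cosmetic change while the outcome-based check still passes; the proportion of checks of each kind across a suite is a rough proxy for how much maintenance every release will trigger.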

Testing at the right time

Test early and often. Test first and fail faster. The ability to ‘shift left’ our testing in the software lifecycle is a core tenet of agile development.

We’ll never get away from final performance testing and acceptance tests on the right side of the lifecycle, any more than we can dispense with static code analysis and early peer review on the left side. There’s nothing wrong with comprehensive unit testing, or API testing, or smoke testing, as long as it is done smartly and strategically. You can even have very carefully designed UI tests to validate against the most important business risks the application might face in production.

Still, there is a lot more we can do to weave critical functional, regression, integration, performance and security testing into every phase of the SDLC to make it a smoother experience. The problem may not be ‘too much automation’ so much as making sure the test automation we do adds business value, avoiding an imbalanced and potentially unhealthy situation for any one kind of testing at any one phase of software development.

From an overall development program and project perspective, define which test workloads need to be attended, and which can be left unattended (or programmatically launched and run), with feedback into continuous delivery processes. Unattended tests should be orchestrated via policies that set boundary conditions -- too many exceptions, or periodically, too few -- so teams can be alerted when severe errors are discovered, or when tests appear unnecessary or are ineffective, allowing defects to escape into later phases.

Attended test automation should represent test suites that need to be deliberately launched or closely monitored by devtest, SRE and security professionals. These usually occur at phase gates -- for instance, to check regressions against the existing codebase at the developer’s pull request, or to conduct performance or chaos tests as a critical pre-production checkpoint.
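As a minimal sketch of such a boundary-condition policy -- the thresholds and function name are illustrative assumptions, not taken from any specific orchestration tool:

```python
def review_unattended_run(total_tests: int, failures: int,
                          max_failure_rate: float = 0.10,
                          min_failure_rate: float = 0.001) -> str:
    """Classify a completed unattended suite run against policy bounds."""
    if total_tests == 0:
        return "alert: suite executed no tests"
    rate = failures / total_tests
    if rate > max_failure_rate:
        # Too many exceptions: severe errors may have been discovered.
        return "alert: failure rate exceeds policy, notify the team"
    if rate < min_failure_rate:
        # Too few exceptions over time: the suite may be redundant,
        # or ineffective at catching defects before later phases.
        return "review: suite may be redundant or ineffective"
    return "ok"
```

A scheduler would run a check like this after every unattended batch and page a human only when a boundary is crossed, keeping attended attention focused on the phase gates.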

EdgeVerve, an Infosys subsidiary, produces a market-leading core banking application. The company faced a timing problem as its dev team transitioned to the SAFe agile methodology. With an existing library of more than 100,000 test scripts, largely captured from user interfaces, teams were spending 50-75% of their cycle time on test maintenance, with erratic defect escape ratios.

The teams rethought their entire test suite, with more modular component-level and API-based tests that could be reused within every phase -- regression, usability, business flow, interface, and migration. They now catch as many as two-thirds of defects within each sprint, while 44% of the old test cases were deemed redundant -- a drastic improvement in delivery speed and quality without additional cost.

Testing at the right place

To balance our own health over the years, we stretch to retain flexibility and a better range of motion. In software terms, we also train for flexibility, so our testing will remain resilient and useful wherever there is risk, at every layer of the application architecture.

What are your stretch goals for focusing test automation at the right place?

§ UI test resiliency. Testing at the user interface layer has never been more critical. However, there is an age-old problem with any attempt to automate UI testing: end users now see software changes faster than ever, and every update can cause a correlating change in the UI, breaking many automated tests.

Testing needs to become more business-context-aware to survive in this mode. That first requires a change in mindset, so test authors validate business outcomes rather than specific on-screen results. Just as importantly, the test automation technology itself needs to be flexible and intelligent enough to recognize a range of success and failure conditions.

§ API and microservices layer validation. Focusing exclusively on the client layer isn’t a sustainable test strategy anyway, given today’s distributed, service-based architectures. Application logic is now largely defined in the conversation between services and systems -- whether called from third parties via API, or defined as modular microservices by the company’s own teams.

This architectural approach poses a real skills challenge to many test teams, but once they are armed with service-layer testing solutions, the path to reliably repeatable results becomes much clearer.

§ End-to-end data testing. System and UI tests tend to see data only in terms of inputs and results, without considering how the data was moved or transformed within the software workflow.

To achieve business context for our testing efforts, we need to validate data as part of a workflow in its current state, whether in flight or at rest, rather than only testing for specific results in system or UI testing. Only then can we truly contain the ‘blast radius’ of data that is lost or misinterpreted on its journey, which can impact other dependent services and downstream work.

§ Reduce scope for higher impact. Enhanced visibility allows engineering leads to conduct a proper impact analysis. Risk/reward calculations recommend where automation should be increased to deliver value, using machine learning trained on normalized metrics from many prior implementation successes and failures.

Let’s say you need to assure quality during an upcoming SAP upgrade to integrate new cloud services. Using a solution like Tricentis LiveCompare, teams can not only focus test automation on the high-risk levers that affect critical business decisions, they can also understand when additional testing will stop adding value.
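To illustrate what a service-layer check can look like, here is a hedged sketch: the payload shape and field names are invented for the example, and the dict stands in for a parsed JSON response from a hypothetical transfer API:

```python
def validate_transfer_response(payload: dict) -> list:
    """Return contract violations for a transfer response (empty = pass)."""
    errors = []
    # Assert the business outcome, not the exact rendering of the response.
    if payload.get("status") not in {"accepted", "pending"}:
        errors.append("unexpected status: %r" % payload.get("status"))
    amount = payload.get("amount")
    if not isinstance(amount, (int, float)) or amount <= 0:
        errors.append("invalid amount: %r" % (amount,))
    if "transaction_id" not in payload:
        errors.append("missing transaction_id")
    return errors

# A well-formed response passes; a malformed one is caught field by field.
ok = validate_transfer_response(
    {"status": "accepted", "amount": 250.0, "transaction_id": "t-100"})
bad = validate_transfer_response({"status": "failed", "amount": -1})
```

Because the check targets fields that carry business meaning, it stays valid when the service’s formatting or field ordering changes -- the reliably repeatable results described above.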


ANZ took a disciplined approach to transforming the bank’s QA organization into an agile, ‘Spotify-style’ team of small engineering sprint groups, empowered to ask all the right questions about exactly what and where they should be testing.

The company piloted several systems and adopted Tricentis Tosca to run test automation against its Kafka data lake and mainframe services. With business-context-aware testing, ANZ improved from quarterly releases with limited test coverage for 8 apps to agile releases every 2 weeks, with 70-90% risk-based test coverage of more than 70 application services, including microservices and end-to-end testing.

“As we are emitting a lot of information, it is very important that we always know whether our testing is going in the right direction or not, so we roll up all of our quality indicators.”

-- Jitendra Vysyaraju, ANZ Banking Group, at the Tricentis Accelerate 2019 event

Testing with the right resources

The best-performing companies always emphasize improving the productive capacity of all team members -- through a cultural combination of professional achievement, organizational design, education, and skill development -- balanced with putting the procedures, tooling and infrastructure in place to make them successful.

What’s feeding your team’s test automation? These essential ingredients should always be present to catalyze high-quality, agile software delivery.

§ Fast feedback loops. Near-instant results in response to test runs, plus fast feedback from both simulated environments and live customer usage, are the ‘superfood’ of test-driven development. High test responsiveness allows teams to debug problems or pinpoint root causes for developer issue resolution, without context switching.

§ Requirements test automation. If you really want to shift testing all the way to the left, why not start testing the requirements themselves? This approach may sound absurd, but requirements testing has been an essential design proof point in highly regulated environments like electronics and defense for decades. Solutions like Tricentis Tosca allow test requirements to be weighted and prioritized by business value and risk, allowing tests with less relevance and impact to be left off the budget entirely.

§ Self-service automated environments. All the goodness of software-defined infrastructure as code (IaC) and elastically scaled public/private cloud capacity isn’t just for software build and deployment teams. The automated setup and teardown of complete test environments, replenished with golden-state test data, drastically reduces cycle times and overhead costs. Self-service makes all the difference here, as test teams thrive when they can readily provision their own resources without having to log an IT support request.

§ Service virtualization for simulation. Sometimes you need to defy reality and settle for virtual environments. Service virtualization (SV) allows the capture, configuration and simulation of systems, so you no longer need access to the real thing. Why could this be better? Not only does SV eliminate availability conflicts over constrained systems, the virtual environments can more predictably model the ‘to-be’ state and scenario data, including edge conditions that are hard to reproduce in the real world, for more consistent test results.

§ Welcome our AI collaborators. Call it machine learning or augmented intelligence, but we’re starting to see a new class of AI-driven testing that can visually detect and identify on-screen elements, and understand how the human user’s interactions with these objects are tied to application logic.

Tricentis introduced NEO (a neural optical engine) to self-heal testing, so functional, use-case and non-functional tests can remain valid and stable without maintenance, even when the presentation layer changes. There’s no ‘magic’ here -- teams still need to tell NEO where to look -- but once engaged, this cognitive engine enables automation to keep pace with software change, delivering fast feedback in line with coding and testing tools.
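The core idea behind service virtualization can be sketched in a few lines of standard-library Python, without any vendor tooling. This example spins up a throwaway HTTP stub that models a dependency’s ‘to-be’ responses, including a hard-to-reproduce edge condition; the endpoint paths and payloads are hypothetical:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class VirtualQuoteService(BaseHTTPRequestHandler):
    """A stand-in for a constrained real system, serving canned responses."""
    def do_GET(self):
        if self.path == "/quote/overdrawn":
            # Edge condition that is hard to trigger on the real system.
            body, status = {"error": "insufficient funds"}, 422
        else:
            body, status = {"quote": 104.5, "currency": "USD"}, 200
        payload = json.dumps(body).encode()
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

    def log_message(self, *args):
        pass  # keep test output quiet

# Bind to an ephemeral port so parallel test runs never conflict.
server = HTTPServer(("127.0.0.1", 0), VirtualQuoteService)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

with urlopen("http://127.0.0.1:%d/quote/usd" % port) as resp:
    data = json.loads(resp.read())
server.shutdown()
```

Tests hit the stub instead of the shared system, so environment availability stops gating the suite, and each edge case is one canned response away.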


The Intellyx Take

Given the chaos inflicted upon our applications in the real world, over-engineering and over-automation of testing are only natural human responses.

Test automation doesn’t have to become unhealthy. Smart, strategic automation is both the best preventative measure and the best medicine for what ails software delivery. Properly incentivized test teams with a balanced approach can overcome the false positives and negatives, and the data and alert exhaust, that accompany an ever-expanding test automation suite.

It shouldn’t matter what development languages or tools are in use, what infrastructure you are delivering into, or what stage of development your apps are in. Achieving a balanced software test automation practice that uses intelligence to focus on the critical challenges will free up human minds for creative solutions and critical thinking -- resulting in less risk, more output and more real innovation.


About the Author

Jason “JE” English is Principal Analyst and CMO at Intellyx, a boutique analyst firm covering digital transformation. His writing focuses on how agile collaboration between customers, partners and employees can accelerate innovation. He led marketing efforts for several high-tech firms, including the development, testing and virtualization software company ITKO, from its bootstrap startup days through a successful acquisition by CA in 2011. JE co-authored the book “Service Virtualization: Reality is Overrated” to capture the then-novel practice of test environment simulation for agile development; more than 60,000 copies are in circulation today.

About Tricentis

Tricentis is the global leader in enterprise continuous testing, widely credited for reinventing software testing and delivery for DevOps and agile environments. The Tricentis AI-based continuous testing platform provides automated testing and real-time business risk insight across your DevOps pipeline. This enables enterprises to accelerate their digital transformation by dramatically increasing software release speed, reducing costs, and improving software quality. Tricentis has been widely recognized as the leader by all major industry analysts, including being named the leader in Gartner’s Magic Quadrant five years in a row. Tricentis has more than 1,800 customers, including some of the largest brands in the world, such as Coca-Cola, Nationwide Insurance, Allianz, Telstra, Dolby, RBS, and Zappos.

©2020 Intellyx, LLC. Intellyx retains editorial control over this document. At the time of writing, Tricentis is an Intellyx customer. Intellyx publishes the weekly Cortex and Brain Candy newsletters, and the Cloud-Native Computing Poster. Image credits: Dierk Schaefer, ‘Balance-Akt’, flickr (CC 2.0); DevOps Review (chart from the book “Enterprise Continuous Testing,” 2019, Platz, Dunlop).