How to Test Smarter?
We can help testers do a better job — a smarter job — by encouraging them to refocus some of their efforts away from test documentation and use that time instead to expand exploratory and ad-hoc testing.
One of the biggest problems facing IT organizations today is the compressed development cycle. Except for huge enterprise initiatives, almost nobody plans for cycles that are longer than a couple of months. Managers have to deal with short cycles and get quality software out the door. Coders have to write software faster. And testers have to accommodate new and changing functionality on ever-more-complex software without letting defects slip through undetected.
If teams must create more software in less time, something has to give in order to improve efficiency. Yes, better SDLC (software development lifecycle) tools certainly can help improve programmer and tester productivity. Yes, automated test technology is incredibly important to both agile and traditional teams.
Even with great development tools, agile processes, continuous integration and test automation systems, quality assurance is a serious bottleneck as business demands ratchet up the pressure and compress the deadlines.
That means testing smarter: reallocating resources away from creating documentation and focusing instead on adding value with exploratory testing.
Reducing documentation doesn’t throw quality out the window. Not at all. The test scripts are still created and still run. Testing smarter requires that developers and testers plan, build and run essential unit tests, functional tests, acceptance tests and security tests.
However, smarter testing does mean acknowledging that some tests are more important than others, and as such should receive more attention, including documentation.
Consider a typical agile team tasked with adding new functionality to a website or mobile application. Early in the sprint, the team would create a test plan and either create new test cases or modify existing ones. Once the coding is done, the team would run the tests and document the execution results and defects. If there are defects, the code would be corrected and the tests rerun. In some cases, the defects might require the agile team to reexamine the test cases as well as the code, and potentially update them too. Rerun tests. Repeat.
Creating and updating test cases takes time and resources. So does the process of documenting the test cases and each of the test runs (though automation helps).
Most test documentation adds no value to the business. If the team tests smarter, testers can focus on writing up test runs if and only if defects appear, instead of documenting every test case and test run. If a run's results are negative (i.e., no defects), move on. If the results are positive (i.e., defects appeared), then yes, testers should document the test, including everything needed to reproduce the defect.
Imagine there are 100 new test cases for a particular sprint. That’s 100 test cases that must be examined, possibly updated and thoroughly documented.
Let's test smarter: say it's determined that 10 of those test cases need to be carried forward for future regression testing. Perhaps another 15 tests failed during execution by producing unexpected or undesired results. If the team needs to document only those 25 key and failed test cases — not all 100 — think about the time savings.
Use that freed-up time to improve quality by encouraging developers, testers and other stakeholders to do more exploratory, ad-hoc type of testing. If the team is fortunate enough to have test-automation tools that can turn ad-hoc tests into reusable test scripts for future regression tests, that’s a bonus, since exploratory tests can be turned into test-case workflows.
Make no mistake: Before development teams decide to test smarter, and stop documenting certain tests, it is essential to ensure that the testers truly understand the goals of a particular development project or phase, and therefore which new tests won’t be needed for future sprints.
In agile shops, that means knowing the objective of each sprint. Understand what’s new or changing in that sprint and in the backlog. Understand the user stories. Agree which tests are only needed in that one sprint (and thus don’t need to be documented) and which tests are needed for future regression testing and acceptance testing (and thus should be thoroughly documented).
Ask yourself, “When the end user receives this sprint’s code, what would he/she be most interested in?” Obviously you need to test there, and document those tests. However, also ask, “What parts of the code would the end user probably not be thinking about, but where he/she could find problems?” Those questions will guide developers, testers and other stakeholders toward edge cases and situations that cry out for exploratory and ad-hoc testing.
The team leaders should envision a high-level approach for what should be tested. There will be key scenarios of each sprint that need to be tested and re-tested because they are highly vulnerable or foundational for future sprints. Once those are identified, those scenarios can be packaged for future regression testing. By contrast, code areas that are not high risk can be tested once — and not used for regression testing, especially if that code is stable and is not affected by future feature enhancements. Therefore, no documentation is required.
We are all under pressure to deliver more code faster. To accelerate software development without sacrificing quality, test smarter!
Use test automation whenever possible, and continue executing unit tests as new code is checked into the source-code management system. Document and run regression tests on critical code, of course, but don’t waste time documenting tests that won’t be needed in the future. Instead, use your testing resources for exploratory testing. That improves quality – and accelerates the development lifecycle.
About the Author
Vu Lam is CEO and founder of QASymphony, a developer of defect capture tools that track user interactions with apps. He was previously with First Consulting Group and has been a pioneer in Vietnam's offshore IT services industry since 1995. He holds an MS degree in electrical engineering from Purdue University. You can reach him at: email@example.com.
Let us know your thoughts, questions and comments.