During my 12-year career in software testing, I have had the privilege of working with different development methodologies. I have witnessed the transformation from Waterfall to Agile and have seen how the expectations, roles, and skills of testing professionals have taken a huge leap.
QAs are now evolving from mere “Bug Finders” into “Bug Preventers”. They are acquiring new skills such as automation, TDD, BDD, and white-box testing, in addition to black-box testing. They are now more solution-oriented and collaborate more closely with the development team and business stakeholders.
I sincerely believe in continuous learning and utilizing a platform (like softwaretestinghelp.com) to share my experiences and learn new concepts and tools.
So here I am again to share my knowledge and experience as a “Tester in Agile”, and to hear your thoughts and opinions.
“Agile Tester – Change in Mindset” is not something that I can communicate effectively in just one section. Therefore, I plan to break my article into three knowledge areas:
- Aligning the Agile tester with the Agile manifesto
- Involvement of testers in TDD, BDD, and ATDD
- Implementing automation in Agile
Aligning the Agile Tester with the Agile Manifesto
The term Agile means “flexible”, or “able to move quickly”.
Agile testing is NOT a new testing technique; rather, being Agile means developing a change in mindset about delivering a testable piece of software.
Before we dive deeper into Agile testing, let’s look back and try to understand the origin and the philosophy behind Agile.
The old story
Before the world moved to Agile, Waterfall was the dominant methodology in the software industry. I am not trying to explain the Waterfall model here; I am only noting some of the practices teams followed when implementing a particular feature.
All these pointers are based on my experience, and opinions may differ.
So here we go…
- Developers and QAs worked as separate teams (Sometimes as Rivals :) )
- Developers and QAs referred to the same requirement document simultaneously. Developers did their design and coding, and QAs wrote their test cases, all from that same document. Planning and execution were done in silos
- Test case reviews were done solely by the QA leads. Sharing the QA test cases with the developers was not considered good practice (the reasoning being that the developers would code to the test cases and the QAs would miss out on defects)
- Testing was considered the LAST activity of an implementation cycle. Most of the time, QAs would get the feature at the last stage and were expected to complete the entire testing in a very limited time. (And the QAs did it)
- The only goal of the QAs was to identify bugs and defects. The overall performance of the QAs was judged by the number of valid bugs/defects they submitted
- The STLC and defect lifecycle were followed during execution, and email was the preferred mode of communication
- Automation was considered an end activity and mostly focused on the UI. The regression suite was considered the best candidate for automation
Yes, following these practices did have its drawbacks:
- Because the teams worked in silos, the only medium of communication between the developers and QAs was “bugs and defects”
- The QAs’ only scope was writing and executing test cases against the finished product
- There was very little to no scope for the QAs to view the code or interact with the developers or business
- Because the entire chunk of the product was released at once, a huge responsibility fell on the QAs at the time of the production release. QAs were generally considered quality gatekeepers, and if anything went wrong in production, the entire blame was put on them
- Apart from functional testing, regression testing of the entire product was also an additional responsibility of the QAs, which comprised a huge number of test cases
Out of all the drawbacks, the major one was the loss of focus on the ultimate goal: delivering a good-quality product at a sustainable pace.
The ultimate goal of the team (developers + testers) is to deliver good-quality software that meets the customer’s requirements and is fit for use. Because of the long time span and increased time to market, the focus blurred, and the only objective that survived was to finish the implementation and move the code to UAT.
QAs concentrated only on executing test cases (ticking items off the test case checklist), making sure the bugs/defects were closed or deferred, and moving on to a different project/module. So, from the QAs’ perspective, the focus was not on the speed and quality of delivery but on completing test case execution (and, of course, some automation).
The Agile Philosophy
It all started in early 2001, when a group of 17 professionals met in Utah (USA) to ski, eat, relax, and have a quality discussion; what came out of it was the Agile Manifesto.
As quality professionals, it is imperative that we too understand the essence of the manifesto and try to shape our thought process accordingly:
Let’s try to align our testing thought process with the Agile Manifesto. But before I do that, let’s understand one thing: in Agile, teams are cross-skilled, and everybody on the team contributes towards the development of the product/feature.
Therefore, I prefer to call the entire team the “Development team”, which includes programmers, testers, and business analysts. Henceforth, I will use the term “Programmers” for developers and “Testers” for QAs.
#1) Working software OVER Comprehensive documentation
The ultimate goal in Agile development is to deliver potentially shippable software/increments in a short period of time, which means that time to market is key. That said, it does not mean quality is at stake. Because the time to market is short, it is important that the test strategy and execution planning focus on the overall quality of the system.
Testing is an endless job that can go on and on, so testers have to determine the parameters against which they can give a green signal for the product to move to production. To do so, it is imperative that testers involve themselves equally in deciding the “Definition of Ready (DoR)” and “Definition of Done (DoD)”, and not forget to help decide the acceptance criteria of the story.
The test scenarios and test cases should revolve around the Definition of Done and the acceptance criteria of the story. Instead of writing exhaustive test cases padded with information that is seldom used, the focus should be on precise, to-the-point scenarios.
The point is to include in the test scenarios/cases only the information that is required and adds value.
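As a rough illustration of scenario-driven testing (my own sketch, not a prescribed template), each acceptance criterion of a hypothetical login story can map to one short, named check. The `login()` function here is a toy stand-in for the feature under test:

```python
# Hypothetical example: concise checks mapped one-to-one to a story's
# acceptance criteria, instead of an exhaustive step-by-step document.

def login(username, password):
    """Toy stand-in for the feature under test (an assumption for this demo)."""
    valid_users = {"alice": "s3cret"}
    if not username or not password:
        return "error: missing credentials"
    if valid_users.get(username) == password:
        return "welcome"
    return "error: invalid credentials"

# AC1: a registered user with the correct password is let in
def test_valid_user_can_log_in():
    assert login("alice", "s3cret") == "welcome"

# AC2: a wrong password is rejected with a clear error
def test_wrong_password_is_rejected():
    assert login("alice", "wrong") == "error: invalid credentials"

# AC3: blank credentials never reach the authentication step
def test_blank_credentials_are_rejected():
    assert login("", "") == "error: missing credentials"
```

Each test name reads like an acceptance criterion, so the suite doubles as a record of what “Done” means for the story.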
#2) Customer collaboration OVER contract negotiation
Let’s have direct communication with the customer regarding our testing approach, and be transparent in sharing the test cases, data, and results.
Have frequent feedback sessions with the customer and share the test results. Ask the customers whether they are satisfied with the results or want any specific scenarios covered. Let’s not restrict ourselves to asking questions and seeking clarification only from the product owner/business to understand the functionality.
The deeper our understanding of the feature, the more precise our test coverage.
#3) Responding to change OVER following a plan
The only thing that is constant is change!
We cannot control change and we have to understand and accept the fact that there will be changes in the feature and requirements; we have to adapt and implement.
Frequent requirement changes are accommodated well in Agile; in a similar fashion, as testers, we need to keep our test plans and scenarios flexible enough to accommodate new changes.
Traditionally, we create a test plan and follow it throughout the lifecycle of the project. In Agile, instead, the plan has to be dynamic in nature and molded to the requirements. Again, the focus should be on meeting the Definition of Done and the acceptance criteria of the story.
I don’t see a need to create a test plan for every story; instead, we can create a test plan at the epic level. Just as epics are written and worked upon, a parallel effort can go into creating the test plans for them. There may or may not be a defined template for it; just make sure the quality aspects of the epic are covered entirely.
Try to utilize the PI (Program Increment) planning days to determine the high-level test scenarios for the stories, based on the Definition of Done and the acceptance criteria.
#4) Communication & collaboration OVER process & tools
Testers tend to be very process-oriented (which is perfectly fine), but we should keep in mind that following a process must not hurt the turnaround/response time for an issue.
In a co-located team, most issues can be resolved through direct communication, and the daily stand-ups provide a good platform for raising and resolving them. It is still important to log a defect, but that should be done for tracking purposes only.
Testers should pair up with programmers and collaborate to resolve defects; if needed, product owners can be pulled in as well. Testers should actively and proactively participate in TDD (Test-Driven Development) and collaborate with the programmers to share scenarios and identify defects at the unit level itself.
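To make the TDD collaboration concrete, here is a minimal hedged sketch (my own illustration, not from any specific team): the tester contributes scenarios, including an edge case, which are expressed as unit tests before the production code is written; the programmer then writes just enough code to make them pass. The `apply_discount` function and its rules are assumptions for the demo:

```python
import unittest

def apply_discount(price, percent):
    """Implementation written after the tests below were agreed upon."""
    if not (0 <= percent <= 100):
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class TestApplyDiscount(unittest.TestCase):
    # Happy-path scenario shared by the tester before coding started
    def test_ten_percent_off(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    # Edge case the tester contributed: an invalid discount is
    # caught at the unit level instead of surfacing later in UAT
    def test_invalid_percent_raises(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)
```

Running the suite (for example, with `python -m unittest`) gives both sides a shared, executable definition of the expected behavior; a defect found here never becomes a logged bug at all.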
“Testing in Agile” is not a different technique, rather it’s a change in the mindset, and change does not happen overnight. It requires knowledge, skill and proper mentoring.
In the second part of the series, I will discuss testers’ involvement in Test-Driven Development (TDD), Behavior-Driven Development (BDD), and Acceptance Test-Driven Development (ATDD) in Agile.
About the author: This article is by STH author Shilpa. She has been working in the software testing field for the past 10+ years, in domains like Internet advertising, investment banking, and telecom.
Keep watching this space for more.