Ad-hoc Testing: How to Find Defects Without a Formal Testing Process

The very term ad-hoc implies a lack of structure, or something that is not methodical. Ad-hoc testing is a form of black-box or behavioural testing performed without any formal process in place.

A formal process here means having documentation such as requirement documents, a test plan, and test cases, along with proper planning of the schedule and order of the tests to be performed. In addition, actions performed during ad-hoc testing are typically not documented.

This is mainly done with the aim of uncovering defects or flaws that cannot be captured through the traditional or formal processes followed during the testing cycle.

As already noted, the essence of this testing lies in not having a formal or structured way of testing. When this kind of random testing is performed, testers work without any particular use case in mind, with the aim of breaking the system.

Hence it is even more obvious that such an intuitive or creative testing methodology requires the tester to be extremely skilled and capable, with in-depth knowledge of the system. Ad-hoc testing helps confirm that the testing performed is complete and is particularly useful in determining the effectiveness of the test bucket.

Recommended reading => Exploratory Testing – How to Think Beyond Traditional Testing Boundaries?


Let’s start with an Ad-hoc testing example: 

Here is an example of how we can perform this testing on a UI wizard.

Let’s say you need to create a plan or a template for some kind of task to be performed using this UI wizard. The wizard is a series of panes that accept user input such as name, description, etc. As the wizard progresses, say on one of the panes, entering user data causes the wizard to throw a context pop-up box in which the associated data is added to complete the wizard and deploy/activate it.

To test this, the tester performs his regular testing, such as:

Now, for the above example here are some test cases for ad-hoc tests that could be performed to uncover as many defects as possible:

Characteristics of ad-hoc testing:

If you look at the scenarios above, you will notice some very distinct characteristics of this type of testing.

They are:

When do we do ad-hoc testing?

A million dollar question indeed!

Most of the time, test teams are burdened with too many features to test within limited timelines. In that limited time span, there are many testing activities derived from the formal process that must also be completed. In such situations, the chance of ad-hoc testing finding its way into the cycle is slim. However, in my experience, one round of ad-hoc testing can do wonders for product quality and raise many design questions.

Since ad-hoc testing is more of a “wild-child” technique that doesn’t have to be structured, the general recommendation is to perform it after the execution of the current test bucket is done. Another point of view is that it can be done when detailed testing cannot be performed due to lack of time.

In my view, ad-hoc testing can be done at almost any time – in the beginning, towards the middle, and towards the end. It finds its place at any given moment. However, the point at which ad-hoc testing brings maximum value is best judged by an experienced tester with in-depth knowledge of the system being tested.

When not to execute?

If the previous question was worth a million dollars, this should be worth a billion!

While we’ve established how effective and fruitful ad-hoc testing can be, as skilled and capable testers we also need to recognise when not to invest in this type of testing. Although it is at the discretion of the tester, here are some recommendations/examples of when it might not be necessary.

Types of Ad-hoc testing:

Ad-hoc testing can be divided into the three categories below:

#1. Buddy testing:

In this form of testing, a test member and a development member are chosen to work on the same module. Just after the developer completes unit testing, the tester and developer sit together and work on the module. This kind of testing enables the feature to be viewed in a broader scope by both parties. The developer gains a perspective on all the different tests the tester runs, and the tester gains a perspective on the inherent design, which helps him avoid designing invalid scenarios, thereby preventing invalid defects. It helps each think like the other.

#2. Pair testing:

In this testing, two testers work together on a module, sharing the same test setup. The idea behind this form of testing is to have the two testers brainstorm ideas and methods to uncover a greater number of defects. Both can share the testing work and make the necessary documentation of all observations.

#3. Monkey testing:

This testing is mainly performed at the unit testing level. The tester passes random data or inputs to the system to ensure that it is able to withstand crashes. This testing can be further classified into two categories:
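The random-input idea behind monkey testing can be sketched in a few lines of Python. Here `parse_age` is a purely hypothetical function under test (not from the article), and the harness treats any exception outside the function's documented ones as a potential defect rather than a graceful rejection:

```python
import random
import string

def parse_age(value):
    """Hypothetical function under test: parses a user-supplied age string."""
    age = int(value)  # raises ValueError on non-numeric input
    if not 0 <= age <= 150:
        raise ValueError("age out of range")
    return age

def monkey_test(func, attempts=1000):
    """Hammer func with random strings; only unexpected exceptions count."""
    expected = (ValueError,)  # exceptions the function is allowed to raise
    defects = []
    for _ in range(attempts):
        length = random.randint(0, 10)
        data = "".join(random.choice(string.printable) for _ in range(length))
        try:
            func(data)
        except expected:
            pass                      # graceful rejection is fine
        except Exception as exc:      # anything else is a potential defect
            defects.append((data, exc))
    return defects

crashes = monkey_test(parse_age)
print(f"{len(crashes)} unexpected crashes found")
```

A real harness would also log the random seed so that any crash it finds can be reproduced.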

Ad-hoc testing benefits:

This testing grants the tester a great deal of freedom to be as creative as necessary.

This increases testing quality and efficiency as follows:

Ad-hoc testing drawbacks:

Ad-hoc testing also has a few drawbacks. Let’s take a look at some of the drawbacks that are pronounced:

Since it’s not very organized and no documentation is mandated, the most evident problem is that the tester has to recollect all the details of the ad-hoc scenarios from memory. This is even more challenging in scenarios where there is a lot of interaction between different components.

Best practices to make this testing more effective:

We’ve discussed at length the strengths and weaknesses associated with this testing.

Ideally, ad-hoc testing should find its place in the SDLC, however, if not approached in the appropriate manner it can prove to be costly and a waste of valuable testing time. So given below are a few pointers to make ad-hoc testing effective:

#1. Identify Defect prone areas:

When you have a good hold over testing a particular piece of software, you will agree that certain features are more prone to errors than others. If you’re new to the system, go ahead and check the features against the defects opened for them. A high number of defects in a particular feature shows that it’s sensitive, and you should choose precisely that area for ad-hoc testing. This proves to be a very time-efficient way of exposing some serious defects.
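Checking features against defect counts can be as simple as tallying a bug-tracker export. The records below are entirely made up for illustration; in practice they would come from your own defect-tracking tool:

```python
from collections import Counter

# Hypothetical defect records exported from a bug tracker.
defects = [
    {"id": 101, "feature": "checkout"},
    {"id": 102, "feature": "login"},
    {"id": 103, "feature": "checkout"},
    {"id": 104, "feature": "search"},
    {"id": 105, "feature": "checkout"},
]

# Rank features by historical defect count; the top entries are the
# most defect-prone areas and the best candidates for ad-hoc testing.
by_feature = Counter(d["feature"] for d in defects)
for feature, count in by_feature.most_common():
    print(f"{feature}: {count} defects")
```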

#2. Building expertise:

Undoubtedly, a tester who has more experience is more intuitive and can guess where the errors might be, when compared to someone who has not much experience. I would say, experienced or not, it’s up to the individual to take the plunge and build expertise on the system that is being tested. Yes, experienced testers have an edge as their skills built up over the years come in handy, but the new testers should use this as a platform to gain as much knowledge as possible to design better ad-hoc scenarios.

#3. Create test categories:

Once you are aware of the list of features to be tested, set aside a few minutes to decide how you would categorize those features and test them. For example, you could decide to test the features that are most visible and most commonly used before anything else, as these are critical to the software’s success. Then you could categorize them by functionality or priority and test them segment by segment.

Another example where this is particularly important is when there is integration between components or modules. In these cases, many abnormalities can occur, and categorization helps ensure this kind of test is touched upon at least once or twice.

#4. Have a rough plan:

Yes, this point might confuse you a bit, as we described ad-hoc testing as testing without planning or documentation. The idea here is to stick to the essence of ad-hoc testing, but still have some rough pointers jotted down on how you plan to test.

A very basic example: sometimes you may simply not remember all the tests you intend to perform, so jotting them down ensures you don’t miss anything.

#5. Tools:

Let’s take an example we all face very commonly. Quite often, the testing of the functionality itself is successful, with no discrepancy reported in its behaviour. However, the logs behind the scenes could be reporting exceptions that testers miss because they don’t hamper the test objective in any way, even though they could be high in severity. Hence it’s very important for us to learn and use tools that help pinpoint these immediately.
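A minimal sketch of such a tool is a log scanner that flags error lines even when the visible test result was a pass. The log text and the `NullPointerException` entry below are invented for illustration; a real scanner would read the application's actual log files:

```python
import re

# Hypothetical log captured during a functionally "passing" test run.
log_text = """\
2024-05-01 10:00:01 INFO  wizard pane 1 loaded
2024-05-01 10:00:02 ERROR NullPointerException in ContextPopup.render
2024-05-01 10:00:03 INFO  wizard deployed successfully
"""

# Flag ERROR lines and anything that looks like an exception name,
# even though the test's visible behaviour was correct.
pattern = re.compile(r"ERROR|\w+Exception|\w+Error")
suspicious = [line for line in log_text.splitlines() if pattern.search(line)]
for line in suspicious:
    print(line)
```

Running such a scanner after every ad-hoc session surfaces the hidden, potentially high-severity problems the functional check alone would miss.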

#6. Document for more defects:

Again, I understand that this may raise some eyebrows. The documentation doesn’t have to be detailed; just keep a small note for your own reference of all the different scenarios covered and any deviations in the steps involved, and record the defects found for that particular feature category. This also helps you improve the overall test bucket, as you can decide how to improve existing test cases or add more if necessary.
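Such lightweight notes need nothing more than three columns per scenario. The scenario names and outcomes below are hypothetical; only the shape of the record matters:

```python
import csv
import io

# Minimal ad-hoc session notes: scenario tried, deviation from the
# normal steps, and the observed outcome (with any defect ID raised).
notes = [
    ("wizard-popup", "closed popup mid-entry", "data lost, defect DEF-42"),
    ("wizard-popup", "pasted 10k chars into name", "handled gracefully"),
]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["scenario", "deviation", "outcome"])
writer.writerows(notes)
print(buf.getvalue())
```

A CSV like this can later be mined to promote the most fruitful ad-hoc scenarios into formal test cases.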


We’ve discussed ad-hoc testing in detail: its strengths, its weaknesses, and the situations in which it would or would not be beneficial.

This is one testing technique that caters to and satisfies a tester’s creativity to the maximum. In my entire testing career, I have gained the utmost satisfaction from ad-hoc testing, as there is no limit to innovation and you only end up more knowledgeable.

Having said that, the main takeaway from all of the above is to determine how to tap into ad-hoc testing’s strengths and make it add value to the overall test process and product quality.

About the author: This is a guest article by Sneha Nadig. She is working as a Test lead with over 7 years of experience in manual and automation testing projects.

Do you perform ad-hoc testing on your project? What are your suggestions to make ad-hoc testing successful?