Agile Testing vs Exploratory Testing

This month I will be doing two sets of training sessions - one for Agile Testing (AT) and one for Exploratory Testing (ET). I have extensive experience with both and I enjoy teaching and coaching both activities. This is the first time that I have been asked to do both training sessions at basically the same time, so I have been giving the relationship between the two some thought.

Over the past decade I have sometimes struggled with the difference between the two. I admit that I have on occasion even described them as being the same thing. Having given it some thought, I now think differently.

The insight came to me when I had a conversation with someone about both topics recently. The courses I'm doing have different audiences, different outlines, and different expected behaviours/outcomes. Yes, there is some overlap, but not much.

I have written previously about what I believe ET to be (it's an interesting post - I suggest you take a moment to read it if you haven't already). Near the end of that article, I mention Agile Software Development and the Agile Testing Quadrants, so there is some relationship between the two.

ET is sometimes described as an exemplar practice of the Context-Driven Software Testing community. If you read through the seven basic principles of the Context-Driven Testing (CDT) school, you may see similarities between its philosophy and the values & principles listed in the Agile Manifesto. Things like:
  • CDT: People work together to solve the problem
    • Individuals and interactions over processes and tools
    • ...face-to-face conversation
    • Business people and developers work together daily throughout the project
  • CDT: Projects are often unpredictable
    • Respond to change
    • Welcome changing requirements, even late in development...
    • Deliver working software frequently...
  • CDT: No best practices
    • Continuous attention to technical excellence and good design ...
    • The best architectures, requirements, and designs emerge from self-organizing teams ...
And we could probably make a few more connections between Agile and the CDT principles.

So what are some of the differences?

For a start, I learned and practiced ET long before the Agile Manifesto was written. That tells me that I am not necessarily doing AT any time I am doing ET. Let me say that again: if you are doing ET, that's great, but it doesn't mean you or your team are agile.

ET is something you can choose to do when you want to say "yes" to thoughtful, mindful testing that takes into account not only the people and the context involved in the project, but also the desire to provide meaningful qualitative and quantitative feedback that cannot [presently] be automated by a computer.

Allow me to illustrate with an example comparing an ET effort to a standard test approach on a given project.

Let's say a particular "test case" has a step that asks you to answer the question: "What time is it?"

A) A standard/traditional or automated test result might simply be:
  • 5:30 pm
... and then proceed to the next step of the test case, the next test case, or whatever. "Moving on" is the point, and you would have done it long before the start of this sentence...

B) An exploratory tester may provide an answer along these lines:
  • 5:30 pm ... or do you want "PM"?
  • 17:30
  • Do we need seconds? Do we need fractions of seconds?
  • 5:30:42 PM EST on Wednesday, 12 February 2014.
  • Wait, do we care about Date/Time stamp, or just the time?
  • What if I have a sundial? Is an approximation good enough?
  • Sunset, dusk
  • "It's Tiny Talent Time" (okay, that dates me ;))
  • It's Miller time. (or whatever other beer brand you may prefer)
  • Hammer Time!! (then bust a move..)
  • Winter time (or other appropriate season)
  • Dinner time.
  • Banana time. (aside: cheers to the old EA Tools team! ;))
  • Overtime. Are we getting overtime pay? Is someone ordering food? ...
  • Hm, how will this input be used in the system? Are there some boundary conditions I can play with that will expose potential downstream failures?
  • What kind of input validation exists on this input field?
  • Can I enter anything into this field? Can I try some constraint attacks, XSS or SQLi inputs?
  • Is this a required field? What if I skip it completely?
  • Is there any user documentation, marketing material, or online help that provides guidance on how I should answer this question?
  • ...
... and more responses than time and space permit me to list here and now. Now, some of you may think that many of these responses are "silly" or "inappropriate", and to those of you I ask: "In what context was the question asked? Are you sure? Could you be wrong? Does everyone on the project team have the same understanding of the question as the users expect or need?"
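
To make this contrast concrete, here is a minimal sketch in Python (the field, the probe values and the helper are my own illustration, not from any real project) of how an exploratory tester might turn questions like those above into a reusable probe set, mixing valid formats with boundary and injection-style inputs:

```python
# Hypothetical probe set for a free-form "What time is it?" input field.
# Each entry pairs a candidate input with the question it is meant to answer.
TIME_FIELD_PROBES = [
    ("5:30 pm", "Is 12-hour format with lowercase meridiem accepted?"),
    ("17:30", "Is 24-hour notation accepted?"),
    ("5:30:42 PM EST", "Are seconds and a time zone handled, or silently dropped?"),
    ("25:61", "Boundary: what happens past the valid hour/minute range?"),
    ("", "Is the field actually required, or can it be skipped?"),
    ("<script>alert(1)</script>", "Constraint attack: is the input HTML-escaped (XSS)?"),
    ("' OR '1'='1", "Constraint attack: is the input parameterised (SQLi)?"),
]

def describe_probes(probes):
    """Turn the probe set into human-readable charter lines for a test session."""
    return [f"Try {value!r} -> {question}" for value, question in probes]

for line in describe_probes(TIME_FIELD_PROBES):
    print(line)
```

The point is not the code; it is that each probe encodes a question about context that a scripted "expected result: 5:30 pm" check never asks.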

We have doubt and so any of the above responses may be valid in one context or another. Without further investigation we cannot be sure which subset of the above responses will help us discover interesting things about: (1) the system, (2) the people using it, or (3) the problem we are trying to solve with the given product or functionality.

You can also see how a tremendous number of questions, and how much information, may be generated by a single exploratory tester. This blows up really fast and will likely slow down your progress through any project if you pause to do this with every single field, function or feature. (NOTE: I'm not trying to discourage you from doing ET here; I want to help you set the right expectations by understanding this reality.)

The point here is that I am describing a process of discovery and exploration, a process of learning. This is an individual's story. Testers usually/often (but not always) work alone, especially in pre-Agile days and even now on large-scale Waterfall-type/outsourced projects.

Doing ET well requires skill and practice. There are numerous models, heuristics, techniques and tools that you need to become comfortable with. [Product/System/User/Problem/Industry/Business] knowledge comes from effective ET practice. Improved shared knowledge and understanding among the project team members is a marker of a good ET practitioner, but that's not always the case, depending on the team and project dynamics.

So what about Agile teams or Agile Testing (AT)?

For a start, any tester who finds themselves alone, sitting at a computer and trying to puzzle out the context of a feature or product on their own, just remember: YOU ARE NOT ON AN AGILE TEAM. (This situation is a clear symptom that your team needs a good agile coach.) Exploratory Testing is a crutch in this case, a kludge. Yes, using your brain (i.e. using ET) to provide good feedback quickly is more helpful to you and your team than following a script, but [structured] guesswork about things you could simply ask your team is an inefficient way to get by. And that goes against the intention of the Agile Manifesto.


When I teach or coach agile teams on how to deliver value more rapidly, I don't teach ET. I get the whole team together and I ask them to work together to develop shared understanding of what they want to deliver. This may include different activities, such as:
  • Lean Startup (i.e. product/feature hypothesis and assumption-checking activities)
  • Story Mapping
  • Specification By Example (what I consider to be the exemplar of Agile Testing)
  • Team pairing: 
    • Product Owner (or Business Analyst, customer...) & Developer (i.e. one or more of [designer, programmer, tester, tech writer] )
    • Dev & Dev (i.e. select one of [designer, programmer, tester, tech writer] and add one more from that same set, even the same role)
  • Sprint/Iteration Planning - Definition of Done (and "Quality") for the deliverables
  • Sprint Demonstration (i.e. take your completed, working code and give it to your customer to play with)
  • and more...
Somewhere in that above list is a Tester. A tester who understands ET will make better contributions than a more traditional-minded tester. That is a fact. I am often asked to help coach QA/Test teams who are excluded from "agile" development teams because they don't know how to give up their standard test documentation or processes. You don't have to be an ET practitioner though -- anyone with an open mind who is willing to adapt and learn new collaborative techniques will likely fit in.

To those [traditional] testers who are worried that they don't know how to fit into an agile team, here is a quick test for you. Please look at the Agile Manifesto values and ask yourself one question: Which items do you value more - the left or the right?

If it's the right (i.e. processes, tools, plans, comprehensive documentation), then, yes, you should update your résumé and seek out new opportunities where you will be happier. I will coach the remaining development team members on how to replace your testing/checking contributions with automated test scripts integrated into their build process. (The truth hurts. Agile isn't for everyone. Accept it.)

If it's the left (i.e. people, working software, responding to change), then you are open to learning new ways to interact with your development team members to provide value to the project. This requires courage. We are asking you to adapt to a new role, maybe more than any other development team member. A good starting point for you that may answer some questions during the transition is in Crispin & Gregory's book Agile Testing.

There's more to AT than what you will find in that book though. Joining an agile team is making a commitment to learning. Practicing ET also requires a commitment to learning. They have this aspect in common. ET focusses on your [individual] learning efforts and what you can do to help provide great, timely feedback to the decision-makers. AT focusses on your ability to facilitate learning and understanding among all the team members to ensure that everyone is on the same page when you deliver working software to the customer.

Yes, I believe that knowing ET will help make a better Agile Tester. You need to know more than just ET though. You can also learn to be a great Agile Tester without formally learning ET. A good agile team will provide you with the feedback you need to help you grow.

What do you think? Does this help clarify the differences and similarities?


  1. Hi Paul,

    Interesting blog post. But it confuses me...

    To me agile testing is testing in an agile context. Exploratory testing is a style that can be done in an agile context but also can be done in ANY other context.

    Testing in an agile context can include many things: scripted testing, automated checking, bug hunts, exploratory testing, Specification by example (or ATDD or BDD), TDD, etc.

    I am not sure what you mean by "Specification By Example (what I consider to be the exemplar of Agile Testing)". To me SBE is "just" a technique. I happen to like the technique very much, but testing in an agile context is way more than SBE.

    You seem to be making the point that agile testing is team work while exploratory testing (or at least the learning) is done individually. I do not agree at all. Learning can be a group thing. Exploratory testing can be a group thing: using a dashboard with the charters on it and discussing what testing should be done next fits perfectly in an agile context and is team work. Paired exploratory testing is an excellent example of team work. It is all about sharing knowledge and learning together.

    Programmers can create code on their own. When they do, they learn during this process but work individually on the code. Does this make it non-agile?

    More about how I think about agile testing can be found in my blog post: What makes agile testing different?

    1. Hi Huib, thanks for the comment and the link to your post on agile testing. I like the post.

      I agree that "agile testing is testing in an agile context" and that's how I usually describe it when coaching (and in my "Pitfalls" workshop). I also agree that the activities you listed may be performed in an agile context, and more. The Agile Testing Quadrants provides a nice way to think of many different testing/checking activities that may be performed on a given project. Individually, the testing activities themselves don't make you agile.

      When I say that "SBE is an exemplar of Agile Testing" I am referring to the Specification Workshop in particular, not the test case-like scenarios that are often automated for checking purposes. The Specification Workshop brings together the "three amigos" (Product Owner, Dev & Tester) to come up with examples that provide a shared understanding of the system. The Gherkin notation scenarios (or FitNesse tables, or whatever) are a happy by-product of the meeting.

      To me, the Spec workshop is the epitome of efficiency in Testing. You have the key team members together designing and testing the system in real-time before any code is written! I have often heard testers lament "testers should be involved sooner in the project, like when the requirements are discussed." Well, this is it. The Spec workshop puts Testing in the forefront of the requirements discussion. Every good tester should learn how to facilitate one of these meetings.
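
      To illustrate, the happy by-product of such a workshop for the "What time is it?" example in the post above might look something like this (a hypothetical sketch in Gherkin notation; the feature and wording are invented, not from any real specification):

```gherkin
Feature: Ask the system for the current time
  # Examples agreed by the three amigos in a Specification Workshop

  Scenario: Show the time in the user's 12-hour locale format
    Given the user's locale uses 12-hour time
    And the system clock reads 17:30
    When the user asks "What time is it?"
    Then the displayed time is "5:30 PM"

  Scenario: Reject an out-of-range manual time entry
    When the user enters "25:61" into the time field
    Then an "invalid time" message is shown
```

      The conversation that produces these examples is where the testing happens; the runnable scenarios fall out afterwards.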

      I grant you that ET _may_ be done as a group, however, I have never seen it done that way (so far). I'd like to see at least two data points (i.e. companies/teams where this is practiced) before I change my language to say it is more common practice. I usually do paired ET as a form of coaching, to help testers learn how to do it. I encourage tester pairing (and know at least one other person who has advocated it in a past company I worked at) but it rarely sticks. Testers tend to work alone.

      I agree with how you describe ET being done with dashboards and daily touch-points; however, information radiators aren't a part of ET itself. I can't imagine doing ET well without one, but they are separate practices. That's one of the primary weaknesses of ET - it takes place in your head. An information radiator (like a testing dashboard) is one way to try and summarise the key points worth sharing with your team or other stakeholders. Info radiators _complement_ ET but aren't required.

      When programmers create code on their own without XP, pairing or TDD, yes, they are being non-agile (I reference Agile Principles #3 and #7-10). When designers design in isolation without external feedback, yes, they are being non-agile. Ditto for tech writers. I generally find that designers and tech writers are the most agile in how they work though. They have generally matured as a profession more quickly than programming or testing... although it is possible to irrationally work in a vacuum and I have seen that happen.

      So, is it possible to do ET in a way that aligns with AT? Yes! Does that always happen? No! Unfortunately.

      I like how you think about this topic, Huib. I believe we are on the same page. Based on what I have seen most companies doing, I can't take the one [agile] scenario you described of how it's possible to do ET as the one true way that it is always done. That doesn't match the reality I see today. There is a bit of overlap between ET and AT in the Venn circles, but not much.


  2. Hi Paul,

    I finally found a moment for a short reply.

    You mentioned: "For a start, I learned and practiced ET long before the Agile Manifesto was written. That tells me that I am not necessarily doing AT any time I am doing ET."

    Surely agile software development existed before the Agile Manifesto was written? For example, Jerry Weinberg wrote in Agile Impressions about how they were already doing incremental development at IBM in 1957, and how this was, as far as he could tell, indistinguishable from XP. Or how they formed, most probably, the first testing group, on Project Mercury in the early 1960s. The testing group on Project Mercury was composed of skillful programmers and worked alongside the rest of the development team throughout the project, from day one. Depending on your definition of Agile Testing, I think glimpses of it were seen 50 years ago.

    If we set the historical references aside, the latter sentence in your quotation is hard to tackle unless I understand better how you define Agile Testing. How do you define it, by the way? (In a sentence or two.) Or Exploratory Testing? I, for example, define Exploratory Testing as "an approach to testing that emphasizes the tester's ability and responsibility to explore an unknown object or space through concurrent test design and test execution."

    Perhaps even more important is WHY do you think it's useful to have separate definition for Agile Testing?

    I personally think exploratory testing is part of basically all software development projects, though it varies a lot how consciously it is used. I've been participating in workshops in my current project where we discuss with the business which features should be implemented and whether the suggested piece of software even delivers business value. Next week I'm participating in technical documentation review meetings, and later in sprint planning. In all of these I do exploratory testing. I listen and observe the information. I come up with questions like "What if we add this 30-character email address to this field in system X?" or "In what kind of physical environments are users using this product?" Then I evaluate the answer and modify my next question based on the answer I've received, or I ask later, once I have thought about it for a while. Responsibility steps in when I try to optimize my ability to ask those questions, perhaps by building better base knowledge about the subject before attending the meetings. I do think that's exploratory testing, and it is part of working with a team. When those questions get asked in face-to-face meetings, others hear them and often have to react to them. That makes it more the team's story than an individual's story.

    You also mentioned: "Yes, I believe that knowing ET will help make a better Agile Tester. You need to know more than just ET though. You can also learn to be a great Agile Tester without formally learning ET. A good agile team will provide you with the feedback you need to help you grow."

    Needing to know more than just ET depends on how we restrict our understanding of ET. For me part of ET is collaborating with the team through those conversations I mentioned earlier and providing as much visibility as possible to my process of learning.

    What do you mean by "without formally learning ET"? If you're learning to be a great Agile Tester without formally learning ET --> What are you actually learning then?

    I'm glad you wrote this post, as it reminded me to go skim that Agile Testing book of Crispin & Gregory's.


  7. Interesting discussion between you and Huib, Paul. We all benefit when two mindful testers discuss the philosophy/theory behind what we do. I wish I had found your site sooner. Thanks!

    You mentioned wanting more empirical evidence to support Huib's comments, as any good CDT tester would. So I thought I'd offer mine. I can't name the Fortune 500 company I work for, but I've built up over the past few years (with help from some other key individuals) what I would consider to be an ET team that practices collaboratively, as you mentioned not seeing in the wild.

    We learn together, we test together, we review/introspect together. I would not hesitate to say that we are agile in that respect. But I believe it's because we practice Context-Driven Testing, specifically that 7th principle Huib mentioned in his post: "judgement and skill ... exercised cooperatively." So I think anyone successfully embracing the context-driven testing principles accomplishes that piece you feel is missing to make them agile.

    For me, it seems the biggest difference between agile testing and ET testing in a CDT context is the feedback loop. I agree with you that agile testers should be embedded with the development team. ET testers are typically on their own team, or they are "teamless" floaters. I say "teamless" because working in an agile team has a required social aspect that is difficult to be part of when you float in and out of the team on a regular basis.

    CDT ET testers (I'll try to think of some more acronyms to throw in front of that) are always trying to shorten that feedback loop with development, and you can't get any shorter than being directly on the development team. I would even go so far as to say that the next evolution of our team would be to start embedding our testers with various development teams throughout our company.

    I hope that info is useful. I'd be interested to hear what your take is on the feedback loop.



    1. Hello Zaed, thank you for taking the time to share your thoughts and experiences. I am always happy to learn of more places that are practicing collaborative ET. It sounds like you are beginning to straddle the fuzzy line between Waterfall and Agile development.

      Here's the point: Agile Testing doesn't really exist. It's not really a thing. The point of Agile software development is to deliver things of value to your customers, and to do so in a way that expects things to change over time. We are not talking about made up silos like: Agile Technical Writing, Agile Programming, Agile Design, Agile Testing, and so on. That is a Waterfall-view of Agile. *Agile Development is the whole picture and it is a team sport - one in which your customers are part of the team.*

      There is a mindset change.

      *** If you are thinking to yourself "How can I be a better Agile Tester?" you are doing it wrong! ***

      So, why do I talk about Agile Testing here? Because I want to reach an audience that might otherwise ignore the message. That is, if I wrote about ET vs Agile Development, I would hear complaints that I'm not talking about the same thing... apples and oranges. Yes. That is in fact what we are talking about here.

      A better analogy might be: testing in an agile context is like the human circulatory system. It is an integrated part of the whole system.

      Above, you mention "an agile team has a required social aspect that is difficult to be part of when you float in and out of the team on a regular basis." That's really a key difference.

      Regarding your last question, the length of the feedback loop may be interpreted in different ways. In *Lean* terms, we are talking about "waste" in the system. For example, if you are using a Bug Tracking system, then you have wasted time/effort in your system. That is also inherently non-Agile.

      If you want to talk about short feedback loops, I would ask "of what? for what purpose?" For example, a short feedback loop to validate that the software is "fit for use" -- have the customer/user sit at the programmer's/team's computer. A short feedback loop that the software builds correctly -- have a Continuous Integration server in place. A short feedback loop that no past bugs have reappeared -- automate the checking of these bugs. A short feedback loop on a potentially infinite space of possible (mis)interpretations of the software/system -- hire some good exploratory testers.
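
      As a small illustration of the "automate the checking of these bugs" loop, here is a hypothetical sketch in Python (the parse_time stand-in and the bug numbers are invented, not from any real system): each previously fixed bug is pinned down by one tiny check that runs on every build.

```python
# Hypothetical regression checks wired into the build: each function
# pins down one previously fixed bug so it cannot silently reappear.

def parse_time(text):
    """Toy stand-in for the system under test: parse 'HH:MM' into
    (hour, minute), or return None for invalid input."""
    parts = text.strip().split(":")
    if len(parts) != 2:
        return None
    try:
        hour, minute = int(parts[0]), int(parts[1])
    except ValueError:
        return None
    if not (0 <= hour <= 23 and 0 <= minute <= 59):
        return None
    return (hour, minute)

def test_bug_1042_rejects_out_of_range_time():
    # Bug 1042: "25:61" used to be accepted and silently wrapped.
    assert parse_time("25:61") is None

def test_bug_1103_tolerates_surrounding_whitespace():
    # Bug 1103: leading/trailing whitespace used to crash the parser.
    assert parse_time("  17:30 ") == (17, 30)
```

      Run these under any test runner as part of the Continuous Integration build, and a regression shows up as a red build within minutes of the offending commit.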

      "Testing" is a dirty word that hides many meanings, so our discussion here may never end. This (blog) is not the best/ideal medium to maximise clarity on the distinctions between and within the different practices. I hope this has helped spark some new ideas or connections.