Reflection on my Testing workshop at the KWSQA Targeting Quality Conference

At this year's KWSQA Targeting Quality conference I gave a half-day workshop titled "Exploratory Testing Basics".  I originally proposed that title since I thought it followed nicely from the shorter workshop I gave at the QAI TesTrek conference in Toronto last Fall. I thought to myself - I'd redo the exercises, change up a few things, and it would be a piece of cake.

As the Winter months progressed into Spring, I began to worry about my workshop idea more and more.  You see, the exercise I gave at the QAI conference, while fun and appropriate, only really covered one aspect of Exploratory Testing - a broader framework. Perhaps that isn't enough?  What is enough, then? What makes up the "basics" of ET?

You see, when I teach ET, it's usually one-on-one and I spend 2-3 days just to cover the basics.  It takes me a few more days of pair testing and debriefing/coaching to help the new tester put everything into practice. It really is quite complex, and many of the ideas and models seem abstract until you try them out and adjust with good feedback.

One of the hardest parts, I feel, is trying to teach certain techniques when the tester doesn't see a need for them.  For example, Pairwise Analysis.  I was introduced to Pairwise Analysis as "Functional Analysis" about 12 years ago and I got it right away. I had done more complex mathematics in university, so it wasn't the math that was the hitch for me - it was knowing when it might be useful and then applying it.

The first project I tried it on was an Installation Testing effort for a desktop application. If you have ever done this kind of thing, you will know that there are *many* features and variations that all conspire to convince you it is a daunting task - one that may consume every waking moment for weeks on end if you try to cover all of the possible combinations of systems, hardware, software, feature selections, and so on.  Enter Functional/Pairwise Analysis. I did the math; came up with a set number of test scenarios; performed them; and reported my findings in record time -- only a few days instead of the customary 2 weeks it had taken on previous releases.
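To give a rough sense of the pairwise idea, here is a minimal greedy all-pairs generator in Python. This is a sketch only - it is not the tool or exact method I used on that project, and the parameter names and values are made up - but it shows why the test count shrinks: instead of covering every *combination* of values, you cover every *pair* of values across parameters.

```python
from itertools import combinations, product

def all_pairs(parameters):
    """Return a small set of tests covering every pair of parameter values.

    parameters: dict mapping parameter name -> list of possible values.
    Uses a greedy search over the full cartesian product, so it is only
    practical for modestly sized parameter spaces.
    """
    names = list(parameters)
    # Every (param, value) pairing that must appear together in some test.
    uncovered = {
        ((a, va), (b, vb))
        for a, b in combinations(names, 2)
        for va in parameters[a]
        for vb in parameters[b]
    }
    tests = []
    while uncovered:
        best_test, best_covered = None, set()
        # Greedily pick the candidate that covers the most uncovered pairs.
        for values in product(*(parameters[n] for n in names)):
            test = dict(zip(names, values))
            covered = {
                pair for pair in uncovered
                if all(test[p] == v for p, v in pair)
            }
            if len(covered) > len(best_covered):
                best_test, best_covered = test, covered
        tests.append(best_test)
        uncovered -= best_covered
    return tests

# An installation-testing style example: 3 x 2 x 2 = 12 full combinations,
# but every value pair can be covered in roughly half that many tests.
configs = {
    "os": ["WinXP", "Win7", "Linux"],
    "browser": ["IE", "Firefox"],
    "db": ["MySQL", "Postgres"],
}
for t in all_pairs(configs):
    print(t)
```

The savings grow quickly: with ten parameters of four values each, the full product is over a million combinations, while an all-pairs set typically needs only a few dozen tests.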

That was really cool! I had a new tool in my Tester's Tool belt and I couldn't wait to try it out again.

Years passed, and after several failed attempts to teach other testers this cool technique, I finally stumbled upon the idea of Just-In-Time teaching. That is, rather than trying to teach a tester all the techniques and models I have learned over two decades and cram them into a few days, I wait until they are presented with a problem and introduce the appropriate technique then.

There are two important takeaways for me with this JIT approach. First, it is really effective and the tester gets it - great! Second, it may take a long time before a tester is presented with the situations where certain techniques apply - not so great.

Present day.

So, what do I cover in a 3-hour workshop that I would consider 'good enough' to cover the basics of ET?

The answer, of course, is that proper workshop coverage is whatever fills the gap between where someone's knowledge is and where you want it to be. Unfortunately, the knowledge/experience starting point for each individual attending my session will be vastly different, their needs for this information will be different, and unless they have specific concerns some of the ideas very likely won't stick.

There are many unknowns in that equation. So how can I plan an outline to cover this unknown gap for unknown purpose(s) in a short amount of time? Stress.

What did I decide to do? I didn't plan an outline. I took a page out of Jurgen Appelo's book and I had the attendees self-organise and decide.

I handed out index cards and asked each person to create a user story card with their goal for the session. We stuck them to a wall with the heading 'Backlog' on it. On another part of the wall I had a Task board with the headings 'To Do', 'In Progress' and 'Done'.

I asked the attendees to decide as a group: select the top 4 cards they wanted to cover this session and place them in the 'To Do' column. They didn't believe me at first and kept asking me to decide. It was really cool to watch the transformation happen and see them take ownership of the workshop goals.

I read each card, picked one, moved it into the 'In Progress' column and began.

So what did I actually cover? The attendees grouped most of the cards together into one big group and the one they picked said that they wanted to "gain an understanding of ET techniques."

Several years ago, I worked in a Financial Services organisation and faced a few audits by banks that needed to understand what testing artefacts we produced and why they didn't match their traditional Test documentation expectations.

Knowing that I couldn't just show them test session notes, test guides and other Exploratory Testing artefacts because they wouldn't understand what they were looking at, I created a presentation that had 4 parts:

  1. The Challenges in Testing
  2. What is Agile Development?
  3. An Overview of our Systems Testing Approach
  4. Examples of Testing Artefacts we Create on Software Projects

The auditors wanted to see the last part, but I explained that I needed to cover the first three before they could understand what they saw.

In the "ET Basics" workshop at the KWSQA conference, I covered the important aspects of the first three sections above (I have expanded upon the original presentation over the years), and supplemented that with some additional exercises to cover a critical aspect of Exploratory Testing - Test Design.

Did I meet the objectives? I think so. I covered the underlying principles and ideas that form the basis of good testing, "exploratory" or otherwise. The next step would be to look at some specific models and structures and have them perform exploratory testing.

While performing ET may be an exciting and enlightening activity, it is also a complex and challenging one that requires careful debrief.  In other words, we will need more time.

Will I give this same presentation again? Not sure. I might if that's what the attendees want to see. I'll let them decide.  If they want to practice something instead, or focus on managing such a testing activity, then we will do exercises for that instead.

I had fun in this workshop and learned some great things from an exercise I tried for the first time. I also thought the training video I showed was an appropriate fit for both a Friday afternoon and the topic covered. ;-) I can't wait to build upon what I've learned from this workshop experience and offer more in the future!

Cheers! Paul.
