
Agile Testing vs Exploratory Testing

This month I will be doing two sets of training sessions - one for Agile Testing (AT) and one for Exploratory Testing (ET). I have extensive experience with both and I enjoy teaching and coaching both activities. This is the first time that I have been asked to do both training sessions at basically the same time, so I have been giving the relationship between the two some thought.

Over the past decade I have sometimes struggled with the difference between the two. I admit that I have on occasion even described them as being the same thing. Having given it some thought, I now think differently.

The insight came to me during a recent conversation with someone about both topics. The courses I'm doing have different audiences, different outlines, and different expected behaviours/outcomes. Yes, there is some overlap, but not much.

I have written previously about what I believe ET is (it's an interesting post - I suggest you take a moment to read it if you haven't already). Near the end of that article, I mention Agile Software Development and the Agile Testing Quadrants, so there is some relationship between the two.

ET is sometimes described as an exemplar of the Context-Driven Software Testing community. If you read through the seven basic principles of the Context-Driven Testing (CDT) school, you may notice similarities between its philosophy and the values and principles of the Agile Manifesto. Things like:
  • CDT: People work together to solve the problem
    • Individuals and interactions over processes and tools
    • ...face-to-face conversation
    • Business people and developers must work together daily throughout the project
  • CDT: Projects are often unpredictable
    • Respond to change
    • Welcome changing requirements, even late in development...
    • Deliver working software frequently...
  • CDT: No best practices
    • Continuous attention to technical excellence and good design ...
    • The best architectures, requirements, and designs emerge from self-organizing teams ...
And we could probably make a few more connections between Agile and the CDT principles.

So what are some of the differences?

Test Management is Wrong

Test Management is wrong. There. I said it.

I can't believe it took me this long to notice the obvious. If you are doing Software Development in any fashion, and are worried about how to manage your testing to develop a "quality" product, stop it.

Let's be clear about what I mean here. If you consider any software development life cycle (SDLC) process, you will find activities like the following arranged in some fashion or other:
  • Requirements gathering, specification
  • Software design
  • Implementation and Integration
  • Testing (or Validation)
  • Deployment
  • Lather, Rinse, Repeat (i.e. Maintain, Enhance, Fix, Mangle, Spin, and so on)

These activities aren't Waterfall or Agile or anything else; they are just activities. HOW you choose to do them will reflect your SDLC, and I don't care about that right now. The part I'm picking on is the Testing bit near the middle, regardless of whether you do the rest in an agile way, a Waterfall way, or some other way.

In particular, I am picking on the fallacy or myth that a good Test Management plan/process is what you need to develop and release a high Quality product.

Quality Agile Metrics

I was asked recently what metrics I would collect to assess how well an agile team is improving. I paused for a moment to scan through 12 years of research, discussion, memories and experiences with Metrics on various teams, projects and companies - mostly failed experiments. My answer to the question was to state that I presently only acknowledge one Metric as being meaningful: Customer Satisfaction.

We discussed the topic further and I elaborated some more on my experiences. Regarding specific "quality" metrics, I explained that things like counting Test Cases and bug fix rates are meaningless. I also referred to the book "Implementing Lean Software Development" by Mary and Tom Poppendieck (which I highly recommend BTW) which warns against "local optimizations" because they will eventually sabotage optimization of the whole system. In other words, if I put a metric in place to try and optimize the Testing function, it doesn't mean the whole [agile] development team's efficiency will improve.

It needs to be a whole-team approach to quality and value. Specific measurements and metrics often lead to gaming of the system and a focus on improving the metrics rather than on delivering quality and value. If the [whole] team is measured on customer satisfaction, then that is what they will focus on. I have long since stopped measuring individual performance on a team.

I haven't stopped thinking about it, though, so I put this question out on Twitter this morning:
Aside from Customer Satisfaction, are there any other Quality metrics you'd recommend in an #agile environment?
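
While I wait for answers, here is a rough sketch (in Python) of the kind of measurement I do trust: one number, owned by the whole team, tracked release over release. The survey scale, release names, and scores below are invented for illustration - this isn't a prescription for any particular survey tool.

    from statistics import mean

    # Hypothetical survey responses on a 1-5 scale (1 = very unsatisfied,
    # 5 = very satisfied), grouped by release so the whole team looks at one
    # shared trend instead of per-person numbers.
    responses_by_release = {
        "Release 1": [4, 5, 3, 4, 4],
        "Release 2": [5, 4, 4, 5, 3],
        "Release 3": [5, 5, 4, 4, 5],
    }

    def satisfaction_trend(responses):
        # Average satisfaction per release, rounded for readability.
        return {release: round(mean(scores), 2) for release, scores in responses.items()}

    for release, score in satisfaction_trend(responses_by_release).items():
        print(f"{release}: average satisfaction {score}/5")

The arithmetic is deliberately trivial. The point is that the only number on the wall is one the whole team owns, so there is no local measure to game.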

Testers, Learn about Agile (and Lean)

Let me tell you about something called Dramatic Irony. You see it in movies, television shows, plays, and in many other places. It happens when you (as the audience or observer) see or understand something that the main characters don't. Oftentimes this is funny; sometimes it's not. Personally, I am one of those who likes to laugh when I see it happen.

On my learning/education quest over a decade ago, I took many different positions and roles within various IT organisations so that I could learn different aspects of Quality. I went through various phases, and the one I am least proud of was the "Quality champion." This wasn't a job title so much as a belief that (mis-)guided my actions. The role/part/perspective came mainly from believing what my employer(s) told me at the time - namely that "the QA/Test team was responsible for quality."

If you have worked in Software Development for a while, and perhaps for a larger organisation, you have likely seen someone who believes they are a Quality Champion. They don't want to see any (known) bugs go out; they check up on everyone in the team to see that they have done their reviews or had someone else inspect their work before passing it on to the next person/team; they join committees to create, document, maintain or present processes that will increase the quality of the delivered products/solutions; and so on.

Ah, the poor misguided fools.  Bless their hearts.