Something Interesting about Reporting Bugs

I just happened across a link for the "Common Weakness Enumeration" (CWE) project.

Here's the blurb:
"International in scope and free for public use, CWE™ provides a unified, measurable set of software weaknesses that will enable more effective discussion, description, selection, and use of software security tools and services that can find these weaknesses in source code."

I'll have to look into this later. Don't know anything about it yet.

To be a Good Tester you must think like a Scientist

It's funny how often this particular analogy keeps coming up: the comparison between Testers and Scientists, and the similarity between testing and the Scientific Method.

Most recently it occurred to me while reading one of my son's chapter books. FYI: "chapter" books are short books broken into chapters for kids just beginning to read on their own, usually around 7-8 years old. These aren't Harry Potter books... most of them barely reach 80 pages.

The book that caught my attention is called "Jigsaw Jones #9: The Case of the Stinky Science Project" by James Preller. The main characters are in Grade 2 and in this particular story their teacher was giving a Science lesson:
"The world is full of mystery. Scientists try to discover the truth. They ask questions. They investigate. They try to learn facts. Scientists do this by using the scientific method."

The teacher then handed out sheets of paper which read:


1. Identify the problem. What do you want to know?
2. Gather information. What do you already know?
3. Make a prediction. What do you think will happen?
4. Test the prediction. Experiment!
5. Draw a conclusion based on what you learned. Why did the experiment work out the way it did?

Back when I used to teach High School Physics, I recall giving a set of steps very much like this one. I might have used the word "inferences" instead of "conclusion" but otherwise it's a pretty good list.

When you think about testing software, you generally run through the same process and set of questions. If you don't think about each of these questions, then you're probably doing something wrong.

For example, here are some questions that come to mind when I think of the Scientific Method applied to testing software:

1. Identify the problem.
  • What are the risks?
  • What is the particular feature of interest?
  • What is it you want or need to test, and why?
2. Gather information.
  • What references are around to tell you how something should work? (e.g. Online Help, manuals, specifications, requirements, standards, etc.)
  • What inferences can you draw (or guess) about how something should work? (i.e. based on your experiences testing similar apps, or other parts of the same system, etc.)
  • What can you determine by asking other people? (e.g. customers, programmers, subject-matter experts, etc.)
3. Make a prediction.
  • Design your tests.
  • What is your hypothesis?
  • What are the expected results?
  • Think about any assumptions or biases that might influence what you observe. How can you compensate for these?
4. Test the prediction.
  • Set up the environment.
  • Execute the tests.
  • Be creative! Make as many observations as you can.
  • Collect data.
5. Draw a conclusion based on what you learned.
  • Did you observe the expected result? Does this mean the test passed? Are you sure?
  • If the test didn't turn up the predicted result, does this mean the test failed? Are you sure?
  • Revise the test design and any assumptions based on what you observe.
  • Do you have a better understanding of the risks that drove the test in the first place?
  • Do you have any new questions or ideas of risks as a result of this test?
  • If you collect a lot of data, summarise it in a chart that can help demonstrate the trend or pattern of interest.
  • Write a few words to describe what these results mean to you. (You might not have all the information, but don't worry about that. Just say what you think it means.)
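The five steps above can even be sketched as the skeleton of a small test script. This is just a minimal illustration of the shape of the process: the `apply_discount` method, its discount rule, and the dollar amounts are all hypothetical, invented here for the example.

```ruby
# Steps 1 & 2: identify the problem and gather information.
# Hypothetical risk: orders over $100 should get 10% off. What happens
# right at the boundary?
def apply_discount(total)
  total > 100 ? total * 0.9 : total
end

# Step 3: make a prediction (hypothesis and expected results).
predictions = {
  100.00 => 100.00,  # boundary: no discount expected at exactly $100
  100.01 => 90.009,  # just over the boundary: 10% off
  50.00  => 50.00    # well under: no discount
}

# Step 4: test the prediction and collect data.
results = predictions.map do |input, expected|
  actual = apply_discount(input)
  { input: input, expected: expected, actual: actual,
    matched: (actual - expected).abs < 0.0001 }
end

# Step 5: draw a conclusion based on what you observed.
results.each do |r|
  status = r[:matched] ? "as predicted" : "SURPRISE -- investigate!"
  puts "input=#{r[:input]} expected=#{r[:expected]} actual=#{r[:actual]} (#{status})"
end
```

The interesting part, of course, is step 5: a surprise doesn't automatically mean "failed" -- it means your hypothesis, your test, or the software needs another look.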

In general, I find the Scientific Method to be a very good guideline for both beginners and experienced testers alike. Wikipedia has some entries on the Scientific Method as well as a Portal. I think it's a good read. I'd recommend those pages to anyone serious about becoming a good tester.

If there are things on those pages that you aren't sure about, look them up! You might just learn something new about how to think about things that will help you do your job better.

Happy Learning!

Ubi Dubium, Ibi Occasio (Opportunitas).

Where there is doubt, there is opportunity.

That's my new motto as a Software Tester. =)

It came to me when I read a comic that I borrowed from a friend recently. You see, I'm a fan of the writer J.M. Straczynski, so my friend told me about a comic that JMS had written a few years ago called "Supreme Power" (Max Comics). If you've ever read a comic, you'll know that each issue or story usually has a separate title. Issue #8 of Supreme Power has the episode title "Ubi Dubium, Ibi Libertas" which he translates for you on the last page as: "Where there is doubt, there is freedom."

That title made sense in the context of the story; however, I couldn't stop thinking about that phrase for several days afterwards. There was something about it that I liked, and yet it didn't completely fit what I feel I do as a tester.

When there is doubt, I go to work. I have fun. I explore. I discuss and test. Most of software testing is about working in the space between the vagueness of specs or requirements and their ever-changing interpretation into working software code. So really, doubt is everywhere. Doubt is the whole thing! Where there's doubt, there's opportunity.

Over the years, I have often contemplated different analogies and ways of describing what I do as a software tester so that I could explain it to others who don't really understand the role. (For some reason, if you're not a programmer and you're not doing Support or Sales, most people don't really understand what else there is.)

So now I feel that I'm really close to a good analogy. Doubt is the space where I work and play in. I'm a Doubt Management Specialist or Facilitator, if you will. Someone writes up some specifications based upon what they think the customer wants to the best of their knowledge and understanding. (There's doubt.) Someone else interprets those requirements and transforms them into mathematical algorithms that perform some function on a computer. (There's more doubt. Is that like Doubt-squared? ;-) )

Enter the Software Tester, the go-between. We see the doubt in the specs and come up with some ideas (i.e. tests) to explore the meanings and possible interpretations. We see the doubt in the software as features are incomplete, don't perform as expected, are insecure in some way, or aren't usable or robust enough according to our interpretations and experiences as users of the technology.

If there were no doubt in the whole process, I don't think we'd have anything to do. We'd totally be out of jobs. Maybe we could be Project Managers or Programmers, I suppose. ;-)

So you see, where there is doubt, there is opportunity for us. Opportunity to explore, to test, to ask questions, to find bugs, to strengthen understanding, to clarify, to add value.

If you don't see the doubt, I don't believe you are adding any value. At the end of the day, I believe the best testers are the ones who add value by reducing the doubt in the development project.

It's another way of looking at the problem. I kind of like it. What do you think?

Bonitatem et disciplinam et scientiam doce me (et gaudium ostende me)

There are a few things that I recall from my high school days -- the school motto was one of them: "teach me goodness, discipline and knowledge." It served me well as a guide over the years but I always felt like there was something missing in that motto alone.

Years later, when I was working as a high school teacher, the three main things that I focussed on getting across to my students were: Attitudes, Skills and Knowledge. That was pretty similar to my old high school motto, so it was easy for me to remember and apply. I thought those three things were a good guide for my classes, but again I felt like there was something a little too dry about it that I couldn't quite put my finger on.

One day, during a contract placement teaching one of my favourite topics - Physics - I got some interesting advice from a seasoned teacher on my teaching approach. He said that I did well in the class but that he lived by a different motto: "any class that goes by without some humour and laughter is an hour wasted of the students' lives." And of course he said it with a smile. =)

It's interesting because I have always had this joie de vivre that everyone who gets to know me notices. I like to smile. I love to make jokes. I love to stop and smell the roses and listen to the wind and the trees. I have a penchant for puns and when the going gets tough, I get silly. =)

Yes, I suppose there are times when seriousness is called for. However, for the most part, life is too wondrous and entertaining not to be silly, to enjoy it, and to make other people smile and lighten up too.

Up until that moment when I got that feedback from that teacher, I had been working on a different model: one where you are serious in your professional life and save the humour for your personal life.

After that moment, I ditched the old "professional = serious" model and decided for myself that every hour worth living was worth enjoying, regardless of whether I was at work or at play.

Needless to say, after that advice I began to share my passion for the subjects I taught in the funnest ways I could think of while still adhering to my teaching goals and objectives. A decade later, if you have ever attended one of my QA or Software Testing workshops, you would also know that it is always with a certain amount of levity that I share my experiences and knowledge. After all, if you can't laugh at yourself, who can you laugh at, right?

At work I'm the same. Always the professional, and always with a smile on my face or a joke to add. Don't get me wrong... I'm not a clown every hour, but not a day goes by that I don't try to do or say something to add a little levity or have some fun.

Here's where it gets interesting. 8-)

Not everyone sees it the same way. Some people are simply "no fun at all." In fact, I even worked with someone once with the nickname "the Director of No Fun."

(Aside: Hmm, I wonder if it is a particular trait of middle management to have the least sense of humour? Maybe it's because they realise that their jobs are the most expendable, so they tend to try to look and act important all the time?)

Oh, but it's not just people at work... there are many people out there in the world who have simply forgotten what it is like to have fun when interacting with other human beings. To these people, adding a smile to a thought or reply suddenly takes on a different meaning. Now you are being facetious or patronizing or sarcastic. Your words are not only not taken seriously; you're seen as insulting as well. Nice.

Interesting. Personally, I find that these reactions tell me more about the listener than about the speaker. In fact, they tell me that the listener is not really trying to listen at all, because they are too busy imposing their own negativity on the world around them, thereby tainting everything that they hear.

I feel very sad for these people. It really is a shame that negativity appears to be more predominant in modern society than positivity. Acting and speaking in ways that emphasize goodness, joy and happiness generally makes you the odd one out.

(Aside: It's fun to meet other 'positive' people at parties and elsewhere. It really causes an exponential increase in positive and silly energy that just invigorates and excites me. I don't think that most people can handle having more than one of us around at any one time. =D )

Which brings me back to my old alma mater's motto: "Bonitatem et disciplinam et scientiam doce me." That's a good start, however, I'm also adding: "et gaudium ostende me." My Latin might be a bit rusty, but I'm pretty sure that translates as "and show me joy/happiness/fun."

It's not enough to teach and be taught the right attitude, the right information or the right skills. I want to see the fun that other people have in their work and I want others to see the fun that I have in mine.

If you happen to see me with a smile when you meet me, please feel free to smile back. =)

The Three Physical Requirements of a Good Software Tester

There are three physical elements that I find a good software tester must have:
  1. Good working senses
  2. Brain - ability to think
  3. Heart - someone who cares

Your senses (sight, sound, touch, etc.) give you the information that you need to process.

Your head helps you process the information and form it into ideas and models to work with.

Your heart gives the information meaning. Someone with heart is someone who cares about others and about the quality of their work. Without this, you are little more than a computer.

Interestingly enough, there are some people with reasonably good-working senses who are still unable to see. I think that it is likely an impediment from their head or their heart that prevents them from seeing. Can this be fixed? Perhaps -- if the person genuinely wants to see. Not everyone wants to see.

The problem is no longer a mechanical one but rather a psychological one. That's tricky.

All written communication is fundamentally flawed. It tries to capture some of the three elements above, but usually fails to really grasp the element of 'heart'. (And Poetry is likely the exact opposite: more heart than anything else.) I don't believe that any fully useful communication can take place without being physically present with the other person. There are many things said and understood between people that may be poorly or incorrectly inferred through written communication alone.

A software tester is an Information specialist. Do you have what it takes to be really good at it? Do you really care?


Sometimes after a long day at work, I need some time to settle my thoughts from the day before I can resume normal life. I call this my 'decompression' time, similar to the decompression period required for deep-sea divers, who have to sit in a decompression chamber before they can safely return to normal atmospheric pressure at sea level.

There are days when I'm just so wound up about things at work - someone or something that preoccupied a lot of active think-time - that even by the time I return home I'm still not able to absorb new information until my brain can settle. Sometimes when someone tells me something during this period, it just never makes it into my long-term memory.

I read a phrase in a book recently that I think captures this perfectly. The book is Dragonquest by Anne McCaffrey, and this phrase is used to describe what one of the leaders is thinking when he just returns from a battle after a particularly intense day:
"A man needed a few minutes to digest chaos and restore order to his thinking before he plunged into more confusions."
Sometimes, I find that this decompression time can take up to an hour after I leave work. And that's from a job that I like working at! (I've worked at some places where my thoughts never settle and the stresses prevent me from sleeping. You know you need to move on to someplace else when that happens.)

I call this a short-term decompression period -- which is based on the events of a particular day.

Sometimes when the stress and activity level at work is really high and prolonged for several weeks, I notice a different kind of conditioned response. My head and body become accustomed to a high level of thinking and action, so that when the stress is removed, my body feels almost lost and weightless for up to several days afterwards. I think of this as a long-term decompression period.

Here's an example. Last summer, we had a particularly difficult software release because the deadline was tight, there was a lot of complex functionality to cover, and there were clearly not enough people to help us reach the target. As a result, my colleague and I put in over 200 overtime hours in the course of a few months in order to help make our targets. When the release finally shipped, I found myself wandering around my office space for a few days looking for a fire to put out, a problem to solve, a meeting to go to, or some late night that I needed to come in for. But there wasn't anything. So I just had this doe-eyed, deer-in-the-headlights look about me for a few days until my brain and body could become re-accustomed to a normal workload and workday schedule again.

Sometimes you may not even be aware that your brain is not ready to process new information until the other chaos has been digested. When I come home from work after a particularly hard day, I try to avoid doing anything that will require long-term memory for a short while. My kids are a great distraction for me... I can get lost in their world for a short time to help me clear my head so that I can process new information and new chaos from my second career: home life. =)

Why do we Test?

As with other good questions, this one can be answered with another question:

Why does someone want you to test?

Do you know the answer to that question? If you aren't sure, start by asking the person you are working for.

Can Practitioners write Academic Quality papers?

I've thought about the idea of the AST Journal for some time now. In principle, I really like the idea. One thing that I've worried/wondered about though is the idea of writing a paper that stands up to academic scrutiny (or pretty close to it anyway).

Today, I happened to notice the following Quote Of The Day in the weekly StickyLetter:
"Science is supposedly the method by which we stand on the shoulders of those who came before us. In computer science, we all are standing on each other's feet."
~ Dr. Gerald J. Popek

At first it made me laugh, then it made me wonder about what it would take to write a good paper. I've been reading some good articles and papers on Software Testing lately and if there's one thing I know it's that I don't have the time to do all the required research to produce a really good paper.

I, like perhaps many experienced testers, learned my craft by doing. I picked up ideas here and there over the last 15 years (a conference presentation here, an email thread there, a passing conversation with a colleague or manager, and so on), and as I applied them I started to notice the patterns of what works and what doesn't. I then began to build up some notion of the importance of contextualisation in my successes and failures.

I've written small snippets of perhaps good ideas and thoughts in the past, and I've communicated some other good ideas during workshops and presentations, and I like the idea of sharing knowledge. One thing I don't think I have the time to do, though, is go through all the previously published material to see who thought of what first. It may be important for research historians, but I don't really believe that I have the time or resources available to me to really do a proper job of it. It almost seems weird or absurd to me from one perspective too... who thought of it first? "Well, I thought of idea 'foo' all on my own. I can list you all of the experiences, conditions and factors that led to these inferences and the outcomes of the applications of these thoughts. I didn't read the idea anywhere, so how can I attribute the idea to someone else?"

So where would I begin to look in the published literature to reference who actually came up with the same idea or portions of it before me? Or worse. What if to do a really good job of it, you need to reference articles and papers from across various disciplines (i.e. computer science, psychology, education, engineering, philosophy, sociology, etc.)? That is, what if the profession of Software Testing is really just the centre of a whirlwind of various professions and disciplines all combining into patterns that we each interpret in different ways to successfully complete the tasks before us? How would you know that you've referenced enough people or ideas to do a proper job in your paper?

I just feel so overwhelmed at the prospect sometimes. It's not writer's block... it's the thought that spending a day or two articulating a few good ideas and the contexts in which they seemed to be successful for me might require weeks of research to support in good academic fashion. And even then, I know I would likely miss some other good referenceable point or idea or person.

Is it possible to do a good job writing a good paper and still have a day job? Perhaps. Is it possible to do a good job writing a good paper and still have a life? I don't think I could. Maybe we all are standing around on each other's feet sometimes. So how do we get past this? How do we turn all this information into knowledge so that we can make some progress? How do we help the next generation so that they don't have to reinvent all of the same ideas that we've had to discover on our own over the last three decades?

Learning not to be the best to win friends

When I was young, I developed a habit that I don't really know how I got started on. I'm sure some Shrink could probably extract it from my memories through some quality sofa time, but I'm not really interested in that. The problem is that the habit stuck and it seriously affected how I interacted with others... but I didn't notice right away. I would have had to have been paying attention to notice. Unfortunately, I only developed that observation skill much later in life.

So here's the thing: I was a perfectionist. I know what you're thinking - every tester says that. Well, I was pretty methodical in my approach and fairly obsessive about it too, right from an early age. If I didn't get perfect on a Math test, I practiced the questions I got wrong until I always got them correct. If there was a skill that I didn't excel at, I practiced until I got it down - from video games to languages, from sports to mechanics, from music to cooking... and so on and on. I was only limited by the resources available to me, and as a result I became quite good at a lot of things and a real Information Investigator too. By the time I was twelve I could navigate any and all of the libraries in the city of Toronto (it's a big city). By the time I was in my teens (in the mid-1980s), I learned to navigate the newly developing electronic information systems using a modem hooked up to my Atari 8-bit computer. The advent of the Internet in the 90s was an absolute dream for me! I was an information junkie and I loved to learn. It became a habit: learn something, learn as much about it as possible, get really good at it.

One day I discovered that there's a more efficient way of describing such a person: a know-it-all. Ouch. Kind of harsh. What was worse was discovering that my girlfriends were intimidated by how much I knew and how well I did at school. That was stupid, right? I mean learning was like a video game for me. I did it to see how much I could cram into my brain before I got a brain cramp or ran out of information sources. I didn't do it to feel superior or make anyone else feel inferior. It was just who I was and what I did.

Well that sucked. I wanted people to like me. So I did what came naturally - I pretended to be dumb and intentionally did worse in school. That became my new game - to try and intentionally not do my best, not be perfect. The problem was that my habit was still there, so I had to keep finding ways of hiding what I knew. My teachers didn't like this new turn of events, of course, but I had some good friends and good times so I didn't really care what the teachers thought.

That worked okay for a while I guess. I only thought about dumbing down when I cared about making a good impression. When you're a teenager, that's not really all the time. ;-)

Fast-forward a decade. After graduating from University (finally! they had to force me out because I didn't want to leave :-) ), I discovered a similar but more distressing dynamic in people interactions in the workplace. So here's a question - don't you want to work with people who are good at what they do? Don't you want to be the best at what you do? Apparently, I discovered, a lot of people kind of don't really care all that much about work. It's just a day job that pays the bills for them. And that whole perfectionist thing I've got going on not only intimidates some people, but it also (apparently) makes them look and feel bad.

Well that sucks. Again.

Oh ya, and it gets worse. If you don't kiss the butts of the people in upper management, lie to them and praise them and make them look better than you, you're not likely going to advance within your organisation. It was at this point that I discovered another trait that I didn't know I had - integrity. Basically, anyone who wanted me to kiss their butts could just go ahead and kiss mine. :-b

I wasn't about to dumb myself down in the workplace. Certainly not for some management position amongst all the other self-gratifying, ego-centric, self-praising half-wits. No way. Beware the people you surround yourself with indeed!

Of course, a new lesson I learned was how to balance Integrity and Tact (diplomacy) so as to maintain working relationships. After all, I still wanted a paycheck!

Over the course of several years and several companies, I explored and observed the various nuances of interpersonal dynamics in the workplace with regard to the impact of a product and/or technical expert. Basically, I keep getting better at what I do, so I've had to learn how to be good at it without making anyone else feel bad about themselves, and without sacrificing my integrity when dealing with self-absorbed managerial staff. It's not easy, but I love a challenge! =)

So, why do I bring this up now? Today I had a conversation with my boss - just a regular one-on-one chat that we have every other month at work - and he said something that reminded me of that old habit. He said that other people at work have noticed how much more relaxed and approachable I've been lately.

I know that when I first started at this company, I was a little gung-ho on the whole perfectionist thing (again), but I thought I had turned down the volume on that habit. Some of it is left over from my Software Quality Assurance days as a "Quality Crusader". I try to remain focussed on Software Testing these days, but it's hard to keep my mouth shut sometimes when people are doing tasks in blatantly improvable ways, and since I have a stake in the success of this company I want to see everyone doing a good job - err, well, the best job they can be doing at the time, that is. Alright? =)

I know I had turned down the volume on that aspect of myself, so I had to think for a minute about why my boss was suddenly remarking on the observation that other people had noticed I was more relaxed around the office.

There was something, an event, that happened last October (2006) that has changed my life forever. It's still a bit too personal to mention here right now, but needless to say it got me thinking about where I was spending most of my time. Last summer I had spent too much overtime at work, and perhaps that got me a bit more stressed than usual -- and was likely the comparison benchmark for my boss' observation.

Since November 2006 my attitude towards work (in general) has shifted again. I kind of don't care about it anymore. I think something inside me snapped. Don't get me wrong, I still love Software Testing and still want to keep getting better at it; it's just that now I think I've finally broken free from the expectation of perfection in others. Transferring my own preferences onto others is a dangerous thing; it works in sneaky and subtle ways that you don't usually see coming.

The reality is that I've got more important things to worry about than what other people should care about. If someone cares about what I think, then that's nice, I'll offer my opinion. Otherwise, do whatever the heck you want so long as it doesn't affect my ability to get my job done.

I've read some good stories and articles over the years, and one day I think I would still like to work in an environment where I could have a mentor that I could learn from. Someone that would expect me to be better than who I am and help me to reach my potential. Somewhere where I could demand the best from my team members and actually get that quality because they care too.

Right now I'm content to work with people I like and who like and support me. That means a lot too.

Never Test Before 4

Kind of a silly thought, I know, but it keeps coming back.

I work in a small agile development environment. Development works according to 2-week cycles to complete chunks of code. I keep noticing that anything prior to Cycle 4 or 5 is usually incomplete and unstable for testing. The first several cycles are when all the foundational architectural changes are usually happening.

So we can never really test before (cycle) 4. That's fine. I've got these Ruby scripts to keep me busy in the meanwhile. =)

Observation on the Proofreader Effect

I've been working on some performance test scripts using Ruby (Watir, actually) over the last few weeks, and have been happily rewriting the scripts I first wrote a year ago. (Programmers call this activity refactoring.) I've learnt a lot about Ruby and scripting web apps over the last year. One of the biggest helps came when I read Brian Marick's new book "Everyday Scripting with Ruby". Thanks to that book, my performance test scripts are really slick now and look more like a programmer wrote them. But I digress...

The thing that I've been thinking about over the last few days is the problem of testing the scripts that I've written. Any good tester would never trust a programmer to write error-free code, so why should I trust myself to? But then who should test my scripts? Well, there really isn't anyone else around who can right now so I have to do it myself. Is that a problem? I don't think so.

I'm the biggest stakeholder who cares about these scripts working correctly, while my boss is mostly interested in the numbers and my analysis. So I ran the scripts and worked out the kinks one section at a time until I was able to run them straight through several times without error.

Is that good enough testing? Well, I got the coverage and measurements I wanted, so I guess so. The scripts don't have to be perfect, they just need to give me the data I need. So, it's all good.
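The data-collection part of a script like this can be sketched in a few lines with Ruby's standard Benchmark library. This is only an illustrative shape, not my actual scripts: the `load_front_page` step is a hypothetical stand-in for whatever Watir actions the real scripts drive against the app.

```ruby
require 'benchmark'

# Hypothetical stand-in for the real browser-driving step
# (e.g. navigating to a page with Watir and waiting for it to load).
def load_front_page
  sleep 0.01  # placeholder for the actual page interaction
end

RUNS = 5

# Time each run and keep the raw numbers so they can be compared
# against last year's benchmark figures.
timings = Array.new(RUNS) do
  Benchmark.realtime { load_front_page }
end

# Summarise the collected data for the analysis.
avg = timings.sum / timings.size
puts format("runs=%d min=%.3fs max=%.3fs avg=%.3fs",
            RUNS, timings.min, timings.max, avg)
```

Keeping the raw per-run timings around (rather than only the average) is what makes it possible to spot outliers later when comparing against a benchmark.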

Right. I completed the analysis for this run and then started to compare the numbers against the benchmark numbers from last year. It wasn't until several hours later that I noticed a typo in the output. Eek!

I'll just sneak back into the code and fix that. No one saw that. I'll just re-run the scripts and make sure the output looks "clean" this time. Great! Looks fine now.

So how did I miss that typo? I thought about this for a while. I think the proofreader effect is like a FIFO buffer. That is, I don't think I could have seen this bug until I got the other, bigger bugs out of the way... you know, like the ones that prevented the script from completing or collecting the data I needed in the first place.

First in, First out. Get the big ones out of the way and then I can see the smaller ones that were hiding underneath. The typo was always there but I was just temporarily blinded to it because my attention was so focussed on the bigger fish.

So was I unqualified to test my own code? I don't think so. I caught all the bugs I cared about. It just took me a few days to find them. Would a separate tester have found the typo before me? Maybe, maybe not. The FIFO effect only affected *my* ability to see the little things until the bigger ones were out of the way because I was the one who wrote the scripts. A separate tester would have a different perspective and shouldn't be affected by this FIFO/Proofreader Effect in the same way.

We do Exploratory Testing almost exclusively on our products. When I test, I don't see the same effect happening to me. It's just a matter of time until I get to a feature or page and then I hit it like a whirlwind and move on. It's quite cool and effective. Defect finding rate starting to slow down? Switch to another Test Technique - voilĂ ! More bugs. All the Risks addressed? Move on.

I've seen a number of conversations happening on some of the message boards questioning whether or not a programmer is able to test his or her own code. After this recent experience, I think that if the desire is there and there is enough time, then yes, a programmer should be able to find all the bugs that matter.

Once again, a separate pair of eyes not constrained by the FIFO effect would likely speed up the process. Nothing we didn't already know. A Tester helps you to find the bugs that matter sooner rather than later. Well, a good one will anyway.

Sometimes "Good Enough" isn't good enough

I've been a big fan of the idea of "Good Enough" software testing over the last decade. Rather than thinking that the problem of doing good Software Testing is akin to "Digital" technology with its complete, precise values, I've thought of it more like "Analog" technology with the big dials and reasonable, approximate (and cheaper) signals.

This past week, I've watched my seven-year-old son play with a new LEGO set that he got for Christmas. It's a neat mechanical LEGO set that lets him build a motorised helicopter, cool car, or attack crab thingy. (ASIDE: I can't begin to imagine what the Marketing team's conversation was like when they thought up that last one!) I noticed, when he completed the helicopter and turned on the motor, that it didn't sound right to me. So I went over and took a close look at his creation. It looked correct. There didn't seem to be any missing pieces, but when he turned it on again, I noticed that not all of the gears turned together consistently. I picked it up and took a really good look at it. Not knowing much about how it was built, I just randomly squeezed LEGO pieces that weren't tightly packed whenever I came across them.

There was one set of LEGO pieces that had a gap of about a millimetre. When I squeezed them together, it made a (good) snap sound. I asked my son to turn on the motor again, and this time it not only sounded correct, but the gears all worked together in perfect synch as well. Voilà!

I thought about this for a few moments afterwards. Up until then, my son had worked on the premise that if the LEGO pieces were reasonably attached, it was "good enough". He didn't need to have a tight fit between every single piece to see the finished product. I mean, it looked like the complete picture of the helicopter in the instruction manual, so what difference would a small gap between a few pieces make?

In this case it made a big difference. If it needs to work like clockwork, then "good enough" is probably not enough.

So what's the tie-in to Software Testing? Well, just how scalable is the "Good Enough" approach? For me, it's always been about testing to the most important Risks and using whatever tools and techniques seem appropriate to the situation at hand. It's always seemed kind of foolproof to me.

Maybe my Digital/Analog analogy is a flawed one. I mean, Analog technology has its limits and is not very scalable. Digital technology is more precise and can handle more information. Is there a point when a Digital solution gets so large that it requires an Analog approach again? (I think the answer here is 'yes.')

Is there a time when "good enough" needs to be replaced with a more complete, structured or methodical approach to software testing? I can't think of any situations like that right now, but that doesn't mean there aren't any. That is, I can't think of a time when I wouldn't want to say that good software testing has to strike a balance between the economics, quality and time to market for a product or system. Shipping with bugs is okay if you know that they aren't critical or life-threatening.

So perhaps "good enough" doesn't always apply when we're dealing with real-world objects like LEGO creations, automobiles, watches, et cetera. I think that it still holds pretty well in the virtual world of software testing. Until someone can give me a good example or two of when "good enough" wouldn't be good enough for testing software, I think I'll chalk this up as another distinction between testing software and testing hardware.