I got called into a conference call today because of an emerging question a client had about combination testing.

They wanted us to test their online survey, an application consisting of several questions, most of which offered an array of answers from which the respondent could pick. At the end of the survey, the respondent gets a score.

One section of the survey had 40 questions. Each question had anywhere from 2 to 28 possible answers, making a total of 23 quintillion combinations to test.
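The combination count is just the product of each question's answer count. A minimal sketch in Python (the per-question counts below are made up for illustration; the real survey had 40 questions):

```python
import math

# Hypothetical answer counts for a 4-question survey (the real survey
# had 40 questions with 2 to 28 answers each; these numbers are made up).
answer_counts = [2, 5, 28, 10]

# Every distinct way to fill out the survey is one combination.
total_combinations = math.prod(answer_counts)
print(total_combinations)  # 2 * 5 * 28 * 10 = 2800
```

With 40 questions, that product explodes into the quintillions, which is why exhaustive testing is off the table.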

So what to do when they only have a few weeks to release?

Enter a heuristic combination method called all-pairs. All-pairs is a mathematical technique that takes a table of options and pairs each option in one column with each option in every other column.

This approach is useful for flushing out the risky bugs that can appear when two particular options are combined.

There are two FREE tools I know of that will do this kind of analysis: the all-pairs tool from James Bach, and PICT, a free tool from Microsoft.

Using both tools, I found that the 23 quintillion combinations were pared down to just 673 test cases. The tools produce a nifty Excel-readable table of test cases, consisting of rows for the tester to follow, telling them which answers to select for each question.
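Neither tool's exact algorithm is described here, but the core idea can be sketched as a naive greedy pairwise generator in Python. This is a toy illustration, not how AllPairs or PICT actually work; it brute-forces every candidate row, so it only scales to tiny examples:

```python
from itertools import combinations, product

def all_pairs(params):
    """Greedily pick test rows until every pair of values drawn from
    any two parameters appears together in at least one row."""
    n = len(params)
    # Every (param i, param j, value of i, value of j) pair still uncovered.
    uncovered = {(i, j, vi, vj)
                 for i, j in combinations(range(n), 2)
                 for vi in params[i] for vj in params[j]}
    tests = []
    while uncovered:
        best_row, best_new = None, -1
        # Naive: score every full combination (fine only for toy inputs).
        for row in product(*params):
            new = sum(1 for i, j in combinations(range(n), 2)
                      if (i, j, row[i], row[j]) in uncovered)
            if new > best_new:
                best_row, best_new = row, new
        tests.append(best_row)
        for i, j in combinations(range(n), 2):
            uncovered.discard((i, j, best_row[i], best_row[j]))
    return tests

# Three questions with two answers each: 8 full combinations,
# but pairwise coverage needs only 4 rows here.
rows = all_pairs([["a", "b"], ["x", "y"], [1, 2]])
print(len(rows))  # 4
```

The real tools use far smarter heuristics to build the table, which is how 23 quintillion combinations collapse to a few hundred rows.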

The important point to note when using all-pairs is that it is a heuristic: a fallible method for solving a problem, a.k.a. "a rule of thumb." As such, stakeholders should be told that it's not perfect, but that it can help expose certain risks quickly.

For the PICT tool (which also does triples and a variety of other options), click here. For my brother’s free all-pairs tool written in Perl, click here.

TV Shows for Testing

I love TV. Perhaps too much. But I don't sweat my addiction, because most of the shows I like have a testing component to them.

Notice that your favorite shows, movies, and books all share an important element that you'll also see in testing.


The difference between "desired" and "actual" is called a problem. (It's also called a bug.)

There are many definitions of testing, but my favorite is the discovery of problems as you assess capabilities.

Here’s my list of shows and movies that focus on the juxtaposition of problems and capabilities:

  1. Iron Chef — Food Network — chefs have one hour to cook 5 dishes for a panel of judges. A secret ingredient is revealed at the beginning of the show that the chef must use in all of the dishes — all the while competing with another chef.
  2. Mythbusters — Discovery — Two special effects guys with an extensive array of tools and props, set out to confirm or refute several urban legends.
  3. Ramsay’s Kitchen Nightmares — BBC America — a notoriously irascible chef serves as a consultant to see if he can turn around England’s failing restaurants.
  4. Survivorman — Discovery — a guy drops himself into remote places like wilderness, desert, and snow pack, armed with only a camera, a Leatherman tool, and the clothes on his back. His mission is to get himself out of danger, filming his journey.
  5. America’s Test Kitchen — PBS — culinary experts try out different tools, gadgets, and recipes, sometimes head-to-head.

Movies I can watch over and over again for their testing parallels:

  1. Apollo 13 — astronauts trapped in a failing capsule with only a few days to live.
  2. Super Size Me — a healthy man eats nothing but food listed on McDonald’s menu for breakfast, lunch, and dinner for 30 days. What happens to him?
  3. The Matrix — Life is nothing but a computer simulation. What bugs have you seen in the program?

Seattle CSI Files

Here are the notes I took after Mark Hanf, a detective from Seattle CSI, came to speak on 1/18/07:

  • “We are asked to go to different locations”; parallel: think about testing on different computer platforms.
  • “Look up, not just straight ahead”; parallel: change your perspective when thinking of software tests to run.
  • “Look in the garbage; we go into toilets quite a bit”; parallel: software bugs could reside in places we don’t associate with normally having problems.
  • “Proper documentation with photos”; parallel: we often document our tests and report our findings with screenshots.
  • “Can’t be afraid of heights”; parallel: can’t be afraid of testing on new platforms.
  • “Sometimes you have to match the bullet even though the crime is ‘solved'”; parallel: even though you have found the bug, there may be another cause.
  • “Crime scenes might have CS gas residue”; parallel: we may be digging in an area that complicates our ability to find bugs.
  • Tools: reflective UV imaging screen, forensic stepping plates, sifting screens; parallel: we have special tools as well (inControl, log file tracing, LoadRunner).
  • “We must gather, document, and demonstrate in court that we did everything possible”; parallel: software projects have “bug juries” that we are often called in to testify in front of to make our case.
  • “We study different disciplines: entomology, odontology, etc.”; parallel: we also study different domains… cognitive psychology for usability, brain physiology, and Crime Scene Investigation!
  • “Everybody’s interested in coming in and going right to the dead body”; parallel: We go right for the features that attract us or that are easy to test.
  • “Detectives should cut their own path to an outside crime scene”; parallel: there is more than one way to reproduce or find a software bug.
  • “You get to the scene, are briefed in an initial walkthrough”; parallel: we have client kick-off meetings that tell us what to focus on and where bugs may likely be hiding.
  • “Footwear impressions and fingerprints are there whether we see them or not”; parallel: same is true for software defects… they are almost always hidden.
  • “Sometimes you’re concerned about the floor, but can’t deal with it then and there”; parallel: bugs mask other bugs… we’re concerned about one feature but may not have time to test it right then.
  • “Take photos with scale and without scale”; parallel: when filing a bug, think about its impact not just on the user but on other programs on the system.
  • “Juries expect a lot more, so in some cases, we have to entertain (re: animation) as well as inform”; parallel: sometimes filing a bug is not enough, we have to be an advocate for what we find.
  • “Defense attorneys could discount elements of our case, so we have to be thorough and careful”; parallel: same is true when we deal with programmers; we have to anticipate scrutiny.
  • Photogrammetry: a series of digital photographs taken in succession; parallel: we have mouse-click and keystroke recording tools to document the repro of bugs.
  • Talked about how a boyfriend and girlfriend got into a fight that turned violent; parallel: we develop user stories and scenarios to test for bug pathologies in software.
  • “We can’t say this is what happened, but we can give a logical range of possibilities”; parallel: we’re not always sure what the fault is, but we can suggest possibilities.
  • Projectiles go through glass and leave different signatures; parallel: same is true for bugs… programs leave different signatures on how they use memory or install files.
  • “We have to do presumptive tests sometimes (like the bullet through rubber)”; parallel: we also have to check our basic perceptions to make sure that a bug is really what we think it is.
  • “We take elimination fingerprints to rule out different suspects”; parallel: we do follow-up tests or peripheral tests to rule out other causes.
  • “Keep an open mind… don’t make your evidence fit your theory”; parallel: be mindful of your biases… don’t be fooled into thinking that this is a bug you’ve seen before.
  • “There is a high cost for processing evidence, homicides get priority”; parallel: there is a cost to doing tests… high risk features that lead to crash, hang or data loss get priority.
  • “Temporal evidence: fingerprints can last a long time… might have been there from months before”; parallel: this is the Primacy Bias… a bug might have started weeks ago and shown itself now.
  • Harris vs United States, 1947: “Only human failure to find it, study and understand it, can diminish its value”; parallel: exactly the same for software testing.
  • “Two heads are better than one”; parallel: paired testing, “fresh eyes find bugs.”
  • Staff: Team Lead, Sketch preparer, Photographer, Recorder, Specialists; parallel: Team Lead, Recording Tools, Subject Matter Experts.