Writing Test Plans

Everyone knows what Fedora QA does. “They’re the testers. They test stuff.” Sure, that’s true. But… what, exactly, do we test? And how?

That’s where Test Plans come in. A Test Plan is a document that tells the testers how to test something.

A good software Test Plan should at least answer these four questions:

  1. What special hardware / data / etc. is needed to test this (if any)?
  2. How do you prepare your system to test this? What packages need to be installed, config files edited, etc.?
  3. What specific actions do you perform to check that the software is working like it’s supposed to?
  4. What are the expected results of those actions?

Your answers can be short: “get a bluetooth keyboard and yum install bluez-gnome newer than 0.25” answers questions 1 and 2 just fine. But they need to be complete and explicit: “Get an appropriate keyboard and install the new packages” is not helpful.
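
To make that concrete, the setup for that bluetooth example might boil down to something like this on the tester’s machine (the package name and minimum version are just the ones from the sentence above; substitute whatever you’re actually testing):

    # install the bluez-gnome build being tested
    yum install bluez-gnome
    # confirm the installed build really is newer than 0.25 before starting
    rpm -q bluez-gnome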

Questions 3 and 4 are answered by Test Cases. A Test Case is a single, specific thing to test – a list of actions to perform, and the expected results of those actions. Depending on how complicated a package or feature is, you could have one Test Case or a whole bunch of them.

While writing a Test Plan, pretend that you have an intern whose job it is to test Fedora releases. This brave, brave soul has a few years’ experience as a Fedora user and basic familiarity with using the commandline to configure stuff, install packages, and so on.

Your job is to tell this person how they can test your favorite package or your cool new feature, so they can either a) tell their friends about how great it is, or b) tell you (via bugzilla) if it breaks.

Here’s an example Test Case for Banshee, originally written by our own Balaji Gurudass:


Description:

    Test loading a Local File in Banshee.

Actions:

  1. Start Banshee
  2. Choose “Import Media” from the “Media” menu
  3. Select “Local Files” from the import source dropdown and click the “Import Media Source” button
  4. Select the music file and click Open
  5. Find the song in the music library and play it.

Expected Results:

  1. Banshee starts successfully
  2. The “Import Media” menu item brings up the “Import Media to Library” dialog
  3. The chosen music file is imported into the Banshee library
  4. The music file can be played and the tracker moves along and shows time elapsed.

This gives us a clear, easy-to-follow way for anyone to prove that a new banshee package does (or doesn’t) work like it’s supposed to. Or, at least, one small part of it.
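
If you want to run that case against a candidate update rather than whatever banshee you already have installed, the setup might look roughly like this (assuming the package is simply named banshee and the candidate build lives in updates-testing; adjust for your situation):

    # pull in the candidate banshee package being tested
    yum --enablerepo=updates-testing update banshee
    # note the exact version for your bugzilla report
    rpm -q banshee
    # then work through the actions above and compare against the expected results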

So there’s an overview of how to write test plans. Currently we’re keeping Test Plans in the wiki. There’s a section in each Feature Page for a Test Plan, and if you wanted to write a Test Plan for a specific package (or set of packages), you could add a wiki page named “QA/[package] Test Plan”. For example, here’s QA/Banshee Test Plan.

Next time, I’ll talk about why I think this stuff is so important.

2 thoughts on “Writing Test Plans”

  1. Shouldn’t the numbers in the expected results match the test step they correspond to? In the example above, playing the song is Step 5, but the expected result for Step 5 is Expected Result 4. It’s not a big deal here because it’s obvious, but for more complex test cases, it may not be obvious which result goes with which test step.

    • Yeah, you’re totally right – it’s a very good idea to have the expected results match up with the actions. The original plan that Balaji wrote did that correctly, and I messed it up.
      Still, I prefer that the test cases be easy to read (and write), rather than laying down iron-clad requirements like THERE MUST BE EXACTLY THE SAME NUMBER OF ITEMS IN BOTH LISTS – sometimes that might not make sense.
