Testing for Professional Software Testers

Reviewed by Conrad Weisert, November 5, 2003

Rodger D. Drabick: Best Practices for the Formal Testing Process,
2004, Dorset House, ISBN 0-932633-58-7, 285 pages.

Not about software testing in general

This book is aimed not at software developers but at professional software testers, people who make their living testing software. The book calls them "test engineers". It addresses the final stage of testing finished software, called:

  • acceptance testing,
  • or, by the author's definition, alpha testing,
  • or, by an extremely narrow definition, quality assurance.

Reviewer's disclaimer

I didn't read the detailed sections in their entirety, because I'm not knowledgeable in this highly specialized area. I'm offering these comments as guidance for any prospective reader who might misunderstand the intended scope.

I'll be glad to append comments from anyone who has deeper experience in formal testing.

Unlike the classic by Glenford Myers1, this book does not address the testing done by the developers, including:

Furthermore, there's little recognition of the impact on testing of:

That's a defensible point of view for final-stage testing. We don't care how the thing was built or how it's structured internally. We just care whether it satisfies specifications.

Metamethodological presentation

The author presents his recommended practices in the form of flow diagrams. He identifies them as "IPO" (input-process-output) diagrams, but they're actually layered dataflow diagrams, in a somewhat unorthodox style.

Obviously it took a lot of detailed work to develop these diagrams. It also takes considerable effort to understand them. Presumably those efforts are justified by the rigor of the process descriptions and the lack of ambiguity. The processes struck me as awfully bureaucratic, but they may be appropriate for large, critical applications or for mass-marketed software products.

Requirements confusion

Mr. Drabick recognizes the essential role of requirements in preparing a test plan, but then his view of what constitutes satisfactory requirements documentation is, at best, naive:

"Requirements can be described by means of text, by data-flow diagrams [DeMarco 1978], by object-oriented classes and objects [Yourdon and Argila, 1996], by use cases [Jacobson et al., 1999], or by a variety of other forms (although they should not be presented in verbal form alone)." - p. 35
"Hopefully your requirements have been written in use case form [Jacobson et al, 1999], which will make developing the test procedures even simpler." - p. 180

Vendor plugs

The text contains a number of plugs for Microsoft products as tools to be used by the test engineer:
  Microsoft product            Generic designation
  Project (pp 30, 59, 67)      project management software
  Word (pp 32, 41, 115)        word processor
  Excel (p 41)                 spreadsheet processor
  PowerPoint (p 191)           presentation graphics software

While it's acceptable for a book to recommend a commercial product that's especially well suited to the concepts and techniques being presented, all of the above plugs are out of place in a serious professional book. Each of those four products has two or three well-regarded competitors. Nothing would have been lost by citing the generic designations, perhaps with an example, e.g. "a spreadsheet processor such as Microsoft Excel".

The PowerPoint reference is to a "template" file (actually the collection of diagrams from the text), which one can retrieve from the publisher's web site. Not every reader of this book has Microsoft PowerPoint; a .pdf (portable document format) file would be viewable in all mainstream web browsers.

Not Recommended
except for readers involved with final testing of critical software systems in very formal environments

1 -- Glenford Myers: The Art of Software Testing
2 -- Robert Binder considered object-oriented testing worth 1150 pages in Testing Object-Oriented Systems, 1999, Addison Wesley, ISBN 0-201-80938-9.
