Some compromises are sensible; others dangerous . . .

Requirements Shortcuts for Packaged Application Software Products
© Conrad Weisert, Information Disciplines, Inc., Chicago
1 October 2003

This article may be circulated freely, as long as the copyright notice is included.


A long-standing disagreement

Suppose at the beginning of a system development project, we feel quite sure that we're going to buy an application software product to handle most, if not all, of the computerized functions of the system. We may even have a good idea which product that will be, perhaps based on an impressive vendor's demonstration or on favorable reviews from trusted sources in other organizations.

Two schools of thought have competed since the emergence of packaged application software products in the 1970s:
Many experienced systems analysts and conservative project leaders maintain that we still need to capture and document the users' requirements just as rigorously as if we were going to develop custom software. Otherwise, we may make a serious mistake that won't be discovered until late in the project, when it may be prohibitively costly to undo.

Extreme advocates of this point of view urge the project to proceed through the analysis phases before even considering the "make or buy" decision, no matter how well-suited a given packaged application product may appear.

On the other hand, many managers and impatient user representatives feel that gathering and documenting requirements will only delay the project, since we already know that the requirements will turn out to call for the chosen product. If we expend effort in preparing formal specifications of system outputs, data definitions, and other details of a complete set of user requirements, it's possible that no one will ever look at all that documentation.

Extreme advocates of this point of view dispense altogether with the disciplines of a life cycle and formal requirements specification, and charter a project simply to install the chosen product.

Which approach to planning and managing the project is more appropriate? Which is quicker? cheaper? safer?

As usual in such controversies, both sides score points that we should heed. Advocates can support their views by pointing to actual project experiences, both good and (often over-dramatized) bad. A practical and prudent manager should draw from both points of view and avoid either extreme position.

Risks in bypassing formal requirements

User representatives are unlikely to spot detailed discrepancies between what they need and what a system specification describes. That's a common problem, of course, even when custom software is being developed for them, but it's worse with packaged software products. With in-house (or contract) development, the user representatives remain in contact with the developers. Competent in-house systems analysts are alert to areas of potential misunderstanding, and bring them to the users' attention for resolution. Furthermore, many user representatives have a natural skepticism about software being developed incrementally before their eyes, and are likely to raise issues.

With a product, on the other hand, many user representatives are dazzled by an impressive demonstration and by assurances from a respected vendor's representatives. They may also fall into the common trap of making tacit assumptions about what any claims-processing or factory-scheduling system would naturally have to do.

Example: missing data definition

One particularly troublesome area is the data dictionary. Experience shows that misunderstandings about the meanings of data items are a frequent source of unpleasant surprises late in a project, whether the software is to be purchased or developed.

Suppose that certain reports are to contain one or more year_to_date totals. Of course, each such total implies a corresponding item in the database. Now suppose we later discover that what the users really wanted is a rolling total of the previous twelve months.

From the point of view of a naive end user, there's not a lot of difference between the two. The reports and displays look almost identical. Surely, our users assume, changing from one to the other ought to be a pretty minor software change.

We know better. Where the product's database contains a single year-to-date total, we're going to need to store 12 monthly totals. Furthermore, logic scattered through the programs will have to compute the required totals and, at the appropriate time, to rotate the oldest month off the end. All that may well be a trivial change to the requirements specification, but is anything but a minor change to an existing database design and suite of programs.
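
To make the difference concrete, here's a minimal sketch in Python (the class and field names are hypothetical, not drawn from any particular product) contrasting the two data definitions. The report column looks the same either way, but the rolling version needs twelve stored values plus month-end rotation logic that the year-to-date version never required:

    from __future__ import annotations
    from dataclasses import dataclass, field

    @dataclass
    class AccountYearToDate:
        """What the product's database holds: one running total."""
        year_to_date: float = 0.0

        def post(self, amount: float) -> None:
            self.year_to_date += amount

        def close_year(self) -> None:
            self.year_to_date = 0.0            # a single reset at year end

    @dataclass
    class AccountRollingTwelveMonths:
        """What the users really wanted: a rolling 12-month total."""
        monthly: list[float] = field(default_factory=lambda: [0.0] * 12)

        def post(self, amount: float) -> None:
            self.monthly[-1] += amount         # accumulate into the current month

        def close_month(self) -> None:
            self.monthly.pop(0)                # rotate the oldest month off the end
            self.monthly.append(0.0)           # open a fresh current month

        @property
        def total(self) -> float:
            return sum(self.monthly)           # recomputed from 12 stored values

Even in this toy form, switching from the first definition to the second touches the stored data, every program that posts to it, and the month-end procedure.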

There's no magic

Users are sometimes misled by vendors' claims of nearly unlimited "expandability". For example, some product literature points out that space has been reserved in the database for user-specified fields that can contain anything we like. The fields may have fixed names like usr1 through usr20, or the user may be able to specify the field names as part of the installation. Some product literature proclaims that we can add our own "business rules" to its "rule base".

All of that sounds tempting, but it falls short of assuring us that a given user-defined field can be inserted into a given report or that a given business rule will be triggered at the appropriate time. We still need to specify exactly what happens and when, before anyone can assess whether or not the product will meet the users' needs.
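
As a purely hypothetical illustration (the record layout and rule entry below are invented, not taken from any real product), here's what such reserved fields and rule slots amount to before anyone specifies how they're to be used:

    # Hypothetical product record with reserved "expandability" fields.
    # Reserving usr1..usr20 guarantees only that values can be stored;
    # nothing here says which report shows usr3, who maintains it,
    # or what it means.
    product_record = {
        "part_number": "A-1047",
        "description": "flange, 40 mm",
        **{f"usr{n}": None for n in range(1, 21)},
    }

    # A user-supplied "business rule" is equally open-ended until we
    # specify its trigger and its effect.
    slow_mover_rule = {
        "condition": None,   # evaluated against which fields, on which event?
        "action": None,      # produces what output, at what point in processing?
    }

Until the requirements state which report shows a given user-defined field, what it means, and when a given rule fires, the "expandability" gives us nothing we can evaluate.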

A typical "software product installation" project may be plagued by dozens of such discoveries: essential business rules that don't fit in any definite part of the software, output data based on information that's never entered into the system, and so on. Projects that were thought to be "too urgent" to bother with specifying rigorous requirements all too often get bogged down in late-stage "discrepancy lists".

Some sensible shortcuts

If we've obtained users' manuals or unusually detailed promotional material for an application software product, the systems analyst may cite, as part of the requirements deliverables, certain sections of those manuals that he or she believes accurately reflect what the users want. For example, a report layout reproduced in the vendor's manual may serve as the specification of a required system output, and the manual's data-element descriptions may serve as entries in our data dictionary, provided the users confirm that they're exactly what's wanted.

The point here is that we don't want to waste time documenting something that has already been documented by a vendor. Draw upon anything that clearly and unambiguously expresses the users' wishes. However, if we can't cite a specific reference that expresses exactly what the users want, then we have no choice but to create original documentation.

Note that in either case we end up with a complete package of rigorous requirements specifications before making an irreversible commitment to any application software product.

Warning on Purchase-Unfriendly Methodologies

Some recent fad methodologies are oriented exclusively to in-house software development and don't support buying application software at all. That has the unfortunate effect of reinforcing the misguided impression that we needn't bother with rigorous requirements specification when we expect to buy an application software product.

The so-called "Unified Process" exemplifies such methodologies (see Ivar Jacobson, Grady Booch, and James Rumbaugh: The Unified Software Development Process, Addison-Wesley, 1999, ISBN 0-201-57169-2), but note that its creators admit that it's for software development as contrasted with application system development. How often do we begin an application system development project absolutely sure that we're not going to buy a packaged solution?

So, as we've noted before, does extreme programming.

