March 25, 2007
NOTE: This article may be reproduced and circulated freely, as long as the copyright credit is included.
John Backus was credited with leading the project at IBM that gave birth to the Fortran language and its first compiler in 1956. Although opinion was and still is strongly divided on whether that was a positive or negative contribution to computing technology, everyone agrees that Fortran exerted a massive influence on our field.
With the passing of John Backus it's timely to look back at the origins and early development of Fortran and its effects on our profession. These are some of my personal reminiscences, colored by my own background in large-scale computing. I no longer use Fortran and probably never again will, nor do I recommend it to clients or students.
The original idea was to eliminate the professional coder. Fortran would be used by engineers and scientists to solve their own problems directly. That worked fairly well for solving small problems, but whenever a problem became very large and complex those users needed help and called upon the programming staff. Such help was especially needed during testing, since the most common debugging tool then was a printed memory dump in absolute octal.
Before long, professional programmers were coding full-time in Fortran on behalf of end users whose problems were growing more challenging. Controversies arose as to whether someone whose main activity was writing programs in Fortran should be considered a professional programmer. Eventually Federal Civil Service job descriptions had to be clarified to answer that question affirmatively.
Your Grandfather's Fortran
Fortran has gone through a series of major changes, and the Fortran of today is radically different from and much better than the Fortran John Backus's team produced. The comments here about early Fortran's pluses and minuses describe a historically significant language that wouldn't be recognized by today's young programmers.
Early skeptics were sure that the executable code generated by higher-level language compilers had to be very slow compared with code written by a highly-skilled programmer in assembly language. Given the sluggish speed of first-generation computers, execution efficiency was a serious concern. There might not be enough hours in a day to get the work done.
The skeptics were right, but the performance of Fortran programs still surprised them. Backus's team pioneered certain kinds of optimization, such as recognizing and factoring out common subexpressions. They also developed extremely sophisticated register allocation schemes that minimized memory references. Of course, a skilled programmer knew better than to code repetitive expressions in the first place, and would be inclined to use registers efficiently, but Fortran still did surprisingly well and even surpassed the efficiency of code written by average programmers. The argument about higher-level language inefficiency was over.
Machine independence was not one of the designers' objectives. Indeed, they went to some trouble to give the Fortran programmer access to peculiar features of their own company's 704 computer, such as the console sense switches.
There was no character or string¹ data type, but the EQUIVALENCE statement combined with Boolean masking operations allowed programmers to stuff six-bit (BCI) characters into cells that the compiler thought contained floating-point numbers. Surprisingly, the normally scholarly Communications of the ACM published a series of techniques advising readers how to fool the compiler into doing crude string manipulations. Naturally, those techniques did nothing sensible under later versions of Fortran.
A peculiarity arose from the designers' decision to store arrays in backward order in memory, a curious choice since the subtractive indexing of the IBM 704 and its successors made it easier and more efficient to loop through array elements stored in forward sequence. That rarely posed a problem except when a programmer needed to mix subroutines coded in different languages, and even then was only a minor irritation.
The nadir of machine dependence was reached with the abortive "Fortran III", which allowed the programmer to intermix executable Fortran statements with 7090 assembly language instructions. Negative reactions and other manufacturers' implementations led to Fortran IV, which lifted the worst machine dependencies and became a candidate for ASA standardization.
The 704 Fortran compiler was a stand-alone program. The coming of the successor machine, the IBM 709 with overlapped I-O channels, and its much faster second-generation solid-state counterpart, the 7090, brought the expectation of and necessity for a batch operating system.
Unfortunately, the huge multipass Fortran compiler had been developed with the assumption that it owned the whole machine configuration and with little attention to localized input-output. To make it work under the proposed SHARE Operating System (SOS) was going to be a massive undertaking. To make matters worse, SOS's linking loader couldn't handle the relocatable binary modules produced by the Fortran compiler. Fortran was hostile to SOS and SOS was hostile to Fortran.
In desperation, IBM threw together a crude operating system, which it called "The Fortran Monitor System" (FMS), containing the Fortran compiler and a "Fortran Assembly Program" (FAP). George H. Mealy² described FMS as "an operating system in which every instruction knows about every other instruction", an exaggeration, but indeed FMS was a monolithic mess that you couldn't adapt to either new hardware devices or additional software components.
So, for several years³ the large community of Fortran programmers on large IBM computers had to do their work within a clumsy and inflexible operating environment.
The late Alan Perlis, a supporter of Algol, despised Fortran because it violated basic principles of language design, in particular the principle that a construct legal in one context ought to be legal in any similar context. Fortran was loaded with contexts that violated that principle, as well as a number of strange rules imposed for the convenience of the compiler writers. For example, K+1 was a legal subscript but 1+K was not. Dr. Perlis was especially frustrated by the rapid acceptance of Fortran, which blocked the eventual success of the academic community's favorite, Algol, an elegant language with block structure and recursive functions.
No one had codified structured coding in the 1950s, so it's hardly surprising that Fortran lacked the flow-control constructs we take for granted and relied heavily on GO TO statements for branching. Worst of all was the ASSIGN statement that modified the target of a GO TO at run time. (That ill-advised construct inspired COBOL's notorious ALTER statement, which rendered many programs nearly incomprehensible.)
The shortcomings noted above were minor annoyances compared to the very serious problems a programmer would have in organizing and maintaining a large program. We've already noted the need to repeat constants dozens, sometimes hundreds, of times.
The biggest problem was COMMON storage. Separately compiled functions and subroutines could share such data, but the linking loader didn't match the data names. Instead the compiler, knowing the absolute address of the origin of common storage (77461₈), assigned addresses through positions in a list. If one subroutine's COMMON declarations were off by 1 from those of another subroutine, results could be chaotic and hard to debug. I know of one large project team that designated a secretary whose main responsibility was to keep the COMMON declarations synchronized (using card …).
Fortran was a huge breakthrough in 1956-1957, but by 1961 its shortcomings were widely recognized, and far superior alternatives were appearing. The 1960s saw a succession of new languages, each of which was a strong candidate for replacing Fortran.
But surprisingly, the Fortran user community took little notice. What they already had was working, and they saw no reason to switch.
The Fortran committee in SHARE⁴, however, looked at PL/I, and then passed resolutions calling on IBM to add many of PL/I's features to Fortran. "Why don't you just use PL/I?" some asked. "Because we're Fortran programmers!"
What's in a name?
Some speculated that PL/I would have become the dominant language for scientific/engineering applications if IBM had chosen to call it Fortran VI. Of course that would have doomed it in the business applications community.
. . . the most significant contribution made by FORTRAN is its usage rather than its technology. Because it was designed so early, better ways have been found to do almost everything that is currently in FORTRAN.
- Jean Sammet, Programming Languages: History and Fundamentals, 1969, Prentice-Hall, p. 169
Computer Science is grateful to John Backus and his colleagues for demonstrating the practicality of programming in a higher-level, problem-oriented language. We regret, however, that Fortran was so successful that its use inhibited progress toward better languages. Are today's popular programming languages better or worse than they would have been if Fortran hadn't blazed the trail?
1—The programmer could insert captions and other character-string constants into printed output via a FORMAT statement.
2—George Mealy was associated with IBM during the development of OS/360, and later with Harvard University.
3—Fortran IV (1963) ran under the IBSYS-IBJOB system.
4—The user-group organization for users of large-scale IBM (and compatible) computers.
Last modified 26 May 2011