Engineering “Bugs” Can Cost Lives

Is it human hubris or something more? It’s interesting to look at the two space shuttle disasters and think about how they could have been avoided. There are accusations that the engineers knew of the problems but set them aside in favor of other concerns.

The Ancient Fight Against Programming Bugs

Bugs are nothing new to computer programming – they’ve been around since programming was even a twinkle in Turing’s eye. However, there have been attempts to reduce the number of bugs and the impact those bugs have on our daily lives. I particularly liked this old article from the publication Science regarding a man named Edsger Dijkstra, who tried to get programmers to change the way they thought about programming – so that their programming styles could have a mathematical basis, thus reducing the number of bugs in any given system.
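To give a flavor of what a “mathematical basis” for a program looks like in practice, here is a minimal sketch of my own (the function and names are illustrative, not from the article): a loop annotated with the invariant that justifies it, in the spirit of Dijkstra’s program-proving style.

```c
#include <assert.h>
#include <stdio.h>

/* Sum the first n elements of a.
 * Loop invariant: at the top of each iteration,
 *   total == a[0] + a[1] + ... + a[i-1].
 * The invariant plus the exit condition (i == n) proves the
 * postcondition: total is the sum of all n elements. */
long sum_array(const int a[], int n)
{
    long total = 0;
    for (int i = 0; i < n; i++) {
        total += a[i];  /* invariant restored: total now covers a[0..i] */
    }
    return total;
}

int main(void)
{
    int data[] = {3, 1, 4, 1, 5};
    assert(sum_array(data, 5) == 14);  /* 3+1+4+1+5 */
    printf("sum = %ld\n", sum_array(data, 5));
    return 0;
}
```

The point is not the code itself but the habit of mind: every claim about the program is something you can state precisely and check, rather than hope for.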

“For more than 20 years, Dijkstra has been fighting against the kind of programming that inevitably leads to bugs in computer software. To him, the way organizations like NASA program computers is foolhardy at best, perilous at worst. He believes there is another way, a better way. It involves structuring how a person thinks about programming so that programs themselves acquire a firm mathematical basis. The discipline his work has spawned, called structured programming, has been one of the most important advances in computer software of the past two decades. But it has not come easy.

For his outspoken views, Dijkstra is praised as a visionary by some, condemned as a quixotic dreamer by others. “Nobody remains indifferent,” says his friend Jim Horning, a computer scientist at Xerox’s Palo Alto Research Center. “There are some people in this laboratory who read everything he writes and are extremely grateful for it, and there are others who would not be willing to have him come visit us. He tends to polarize people.”

Part of the controversy centers on the man himself. A prolific and articulate writer, Dijkstra has become something of a conscience for the computing community. In articles, essays, and even satires, Dijkstra has often reminded the field of its faults. In a 1972 address, for example, he denounced, in typically colorful terms, several common programming languages that he believes are critically flawed. “The sooner we can forget that Fortran ever existed, the better,” he said, “for as a vehicle of thought it is no longer adequate: It wastes our brainpower, and it is too risky and therefore too expensive to use.” Another programming language, PL/I, he compared to “a plane with 7,000 buttons, switches, and handles to manipulate in the cockpit. I absolutely fail to see how we can keep our growing programs firmly within our intellectual grip when by its sheer baroqueness the programming language–our basic tool, mind you!–already escapes our intellectual control.”

At the same time, he has had a profound impact on how programming is done, particularly through his theory of structured programming. Structured programming is an attempt to deal with what Dijkstra feels is a programmer’s most insidious problem–sheer complexity. The problem is inherent in the task. All computers, from the chip in a calculator to the computers on board the Columbia, operate under the command of programs stored in their memories. These tell the circuits of the computers what to do–add these two numbers, store this information in that location. If the computer itself, the hardware, is the body of computing, software is its soul.

The complexity of software comes from the dense interrelationships of the instructions in a program. Very large software systems consist of millions of separate instructions, generally written by hundreds of different people. Yet these instructions must dovetail with perfect accuracy. If even a single instruction is wrong, the software system can fail.

No one person can completely understand a system of such complexity. A programmer may understand one part of it. A manager may grasp its outlines. But the system as a whole surpasses human understanding. Many computer scientists claim that these large software systems are the most logically complex things that human beings have ever built.

This complexity has two concrete effects: It causes software for large computers to be expensive and to almost invariably contain errors. Some of the software fiascos of the past have become legendary. The Mariner 1 Venus probe, for example, had to be blown up immediately after its launch in 1962 because of a missing word in its control program. But much more pervasive and important are the countless small programming errors that afflict our computerized society. “There are,” says Dijkstra, “minor annoyances in great multitude–banks erroneously computing your interest, airline reservations getting screwed up, and what we are suffering from now, failures of computer-controlled telephone exchanges to make connections.”

Olson, Steve. “Sage of software.” Science ’84 5 (1984): 74+.
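To see in miniature the kind of control flow structured programming replaced, here is a small sketch of my own (not from the article): the same linear search written first with the goto-driven jumps Dijkstra campaigned against, then in the structured style he championed, where each construct has a single entry and a single exit.

```c
#include <stdio.h>

/* Unstructured version: the reader must chase labels
 * to reconstruct what the code actually does. */
int find_goto(const int a[], int n, int key)
{
    int i = 0;
check:
    if (i >= n) goto missing;
    if (a[i] == key) goto found;
    i++;
    goto check;
found:
    return i;
missing:
    return -1;
}

/* Structured version: one loop, locally checkable reasoning. */
int find_structured(const int a[], int n, int key)
{
    for (int i = 0; i < n; i++) {
        if (a[i] == key)
            return i;
    }
    return -1;
}

int main(void)
{
    int data[] = {7, 2, 9, 4};
    printf("%d %d\n", find_goto(data, 4, 9), find_structured(data, 4, 9));
    return 0;
}
```

Both functions give the same answers, but only the second can be understood a block at a time – which is exactly the grip on complexity the article describes.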

It’s a very interesting topic, and I especially liked the detail of the Mariner 1 Venus probe having to be blown up because of a MISSING WORD.  So crazy!