The Washington Times - Monday, December 31, 2001

Come the end of the year, I find myself reflecting on cosmic topics, such as "Whither?" and "How?"
Technologically speaking, the most cosmic subject we have is computers. They're going nuts. Regarding which: In computing there is what is known as Moore's Law, named for Gordon Moore, one of the founders of Intel. It states that the number of transistors that can be put on a computer chip doubles roughly every 18 months.
This is exponential growth, yet the law has held true for a long time and seems to be continuing. Generally speaking, the power of a computer is proportional to the number of transistors it contains.
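For the curious, here is a back-of-the-envelope sketch (in Python, taking the 18-month figure at face value and assuming no particular chip) of how fast that doubling compounds:

```python
# Back-of-the-envelope arithmetic for the doubling rule described above:
# if capacity doubles roughly every 18 months, how much does it multiply
# over a given span of years? No real chip data is assumed here.

DOUBLING_PERIOD_YEARS = 1.5  # "roughly every 18 months"

def growth_factor(years: float) -> float:
    """Factor by which transistor counts multiply over `years`."""
    doublings = years / DOUBLING_PERIOD_YEARS
    return 2 ** doublings

for span in (3, 15, 30):
    print(f"{span} years -> roughly a {growth_factor(span):,.0f}-fold increase")
```

Thirty years of that rule works out to 20 doublings, a factor of about a million.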
Even to people not particularly interested in computers, it is obvious that the machines are getting better fast. They run at higher speeds, have larger hard disks, more memory and so on. We get used to them. They're just computers.
But they're not just computers. They drive everything else. This isn't necessarily obvious, but it is true. The question is to what extent they do it.
A few years back, I was in Atlanta to write a story on the F-22 fighter for Air & Space magazine. I talked to an old-hand engineer in testing who had worked on planes for decades. With earlier aircraft, designed more or less by hand, he said, you flew the prototype and hoped nothing important broke off. Quite possibly it didn't fly anything like the way you had hoped it would. So it was fly, redesign, fly, redesign, until it more or less worked.
Then, he said, the computer simulations improved. Now you design the plane, "fly" it in simulation in a massive computer, redesign it before it actually exists, and then build it. The result is that there are few surprises. The physical airplane might differ from the simulation by small amounts, but no major rework should be needed.
Industry's reliance on computers has become almost total. Many things we take for granted just wouldn't happen without them. On another magazine story, years back now, I covered the design of the Boeing 777. Boeing designed it entirely in huge mainframes with a couple of thousand computer terminals attached. You could get blueprints if you wanted them, but the entire aircraft was in the computers. It wasn't anywhere else. This is now standard.
The design software (CATIA, if you are a serious aerogeek) did things like check for conflicts. For example, if you were an engineer working on the wiring for the brute, you might want to run a cable from one place to another. A structural guy might have put a bulkhead where you thought you would run your cable. The software would check to be sure that there was an unobstructed path. This isn't just a minor convenience. On an airplane that huge and complex, you could easily end up with thousands of interferences that would have you tearing things out and fixing them forever.
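CATIA's own interference-checking machinery is of course far more sophisticated than anything that fits in a column; the Python sketch below only illustrates the underlying idea, using made-up part names and crude bounding boxes rather than real aircraft geometry:

```python
# Minimal illustration of the idea behind an interference check:
# represent each part by an axis-aligned bounding box and flag any
# pair of boxes that overlap. Real CAD systems use far more precise
# geometry; this only sketches the concept, with hypothetical parts.
from dataclasses import dataclass
from itertools import combinations

@dataclass
class Box:
    name: str
    min_corner: tuple[float, float, float]
    max_corner: tuple[float, float, float]

def overlaps(a: Box, b: Box) -> bool:
    """True if the two boxes intersect on every axis."""
    return all(a.min_corner[i] <= b.max_corner[i] and
               b.min_corner[i] <= a.max_corner[i] for i in range(3))

parts = [
    Box("cable_run",  (0, 0, 0), (10, 1, 1)),   # hypothetical cable path
    Box("bulkhead_7", (4, -5, -5), (5, 5, 5)),  # hypothetical structure
]

for a, b in combinations(parts, 2):
    if overlaps(a, b):
        print(f"Interference: {a.name} conflicts with {b.name}")
```

Run against those two imaginary parts, it reports that the cable run conflicts with the bulkhead, which is exactly the sort of clash the real software flags before anything gets built.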
What it comes to is that we now rely profoundly on computers to handle complexity that would otherwise be beyond our capacities. One of the things we couldn't do without computers is to design them. For example, consider a CPU (central processing unit, the "brain" of your desktop machine) with 50 million transistors. How do you figure out how to wire them together so that wires don't get crossed, so to speak? How do you test the design when you think you have it finished?
You do it on computers, using special design software. Then, when you have the new chip up and running, you use it to design the next, more-complex generation of chips. Designing a chip and verifying the design is so complex that, as a designer at Intel once told me, "The ultimate limitation may not be the impossibility of making smaller transistors, but of understanding and testing what we design." So far we haven't reached that limit, and maybe won't.
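To get a feel for why exhaustive testing is out of the question, and why designers lean on simulation and analysis instead, consider a deliberately simple calculation (my illustration, not Intel's): even a single 32-bit adder, one tiny piece of a CPU, has more input combinations than could ever be tried one by one.

```python
# Rough arithmetic on why exhaustive testing of even a small circuit is
# hopeless: a single 32-bit adder has 2**64 possible pairs of operands.
# At an (optimistic) billion tests per second, trying them all takes centuries.
SECONDS_PER_YEAR = 365.25 * 24 * 3600

input_pairs = 2 ** 64             # every pair of 32-bit operands
tests_per_second = 1_000_000_000  # assumed test rate

years = input_pairs / tests_per_second / SECONDS_PER_YEAR
print(f"About {years:,.0f} years to test one adder exhaustively")
```

That comes to roughly 585 years for one adder, never mind the other tens of millions of transistors, which is why the checking is done by simulation and analysis rather than brute force.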
What it comes down to is that just about everything modern humanity does depends on these machines. Your car was designed on computers. The telephone system amounts to a big distributed computer. Basic scientific advances increasingly depend on huge computers that people seldom hear about. In an astonishingly short time, these machines have become the bedrock of our civilization.
I don't think this is cause for worry. The things aren't going to fail suddenly and leave us in loincloths. But their advent is probably the most profound historical change since Gutenberg created the printing press in the mid-1400s, and has happened in the merest handful of decades.
To reach Fred Reed, e-mail him at [email protected]
