The Computer Society: The Age of Miracle Chips (Feb, 1978)
Here are some articles from a 1978 Time magazine special issue focusing on computers called “The Computer Society.”
The Age of Miracle Chips – Explores the possible effects of computers upon society, including possible economic and social upheaval.
Science: The Numbers Game – Covers the history of computers as well as the science and technology behind designing and producing them.
Business: Thinking Small – Discusses the computer industry, markets and the potential effects of computers upon the business world.
Living: Pushbutton Power – Explores computer uses in the home, school and hospital.
Time Magazine Gets a PDP-11 – Short piece by the editor of Time about the features of their new PDP-11, including its spell-checker, hyphenator, fonts and graphics capability.
The Age of Miracle Chips
New microtechnology will transform society
It is tiny, only about a quarter of an inch square, and quite flat. Under a microscope, it resembles a stylized Navaho rug or the aerial view of a railroad switching yard. Like the grains of sand on a beach, it is made mostly of silicon, next to oxygen the most abundant element on the surface of the earth.
Yet this inert fleck, still unfamiliar to the vast majority of Americans, has astonishing powers that are already transforming society. For the so-called miracle chip has a calculating capability equal to that of a room-size computer of only 25 years ago. Unlike the hulking Calibans of vacuum tubes and tangled wires from which it evolved, it is cheap, easy to mass produce, fast, infinitely versatile and convenient.
The miracle chip represents a quantum leap in the technology of mankind, a development that over the past few years has acquired the force and significance associated with the development of hand tools or the discovery of the steam engine. Just as the Industrial Revolution took over an immense range of tasks from men’s muscles and enormously expanded productivity, so the microcomputer is rapidly assuming huge burdens of drudgery from the human brain and thereby expanding the mind’s capacities in ways that man has only begun to grasp. With the chip, amazing feats of memory and execution become possible in everything from automobile engines to universities and hospitals, from farms to banks and corporate offices, from outer space to a baby’s nursery.
Those outside the electronic priesthood often have trouble grasping the principles of the new microtechnology or comprehending the accomplishments of the minuscule computers. The usual human sense of scale, the proportion between size and capability, the time ratio assumed between thought and action, are swept into a new and surreal terrain. Consequently, people tend to anthropomorphize the computer; they are superstitious about it. In 2001: A Space Odyssey, the companionable computer HAL turns rogue in outer space and methodically begins assassinating its masters. In a B-movie called Demon Seed, the world’s most advanced computer actually impregnates a scientist’s wife, played by Julie Christie; it is so smart that it yearns to be alive, and scarily succeeds. Some manufacturers of computer games have discovered that people are disconcerted when the computer responds instantly after the human has made his move. So the computers have been programmed to wait a little while before making countermoves, as if scratching their heads in contemplation.
A fear of intellectual inadequacy, of powerlessness before the tireless electronic wizards, has given rise to dozens of science-fiction fantasies of computer takeovers. In The Tale of the Big Computer, by Swedish Physicist Hannes Alfven, written under the pen name Olof Johannesson, the human beings of today become the horses of tomorrow. The world runs not for man but for the existence and welfare of computers.
Other scientists too are apprehensive. D. Raj Reddy, a computer scientist at Pittsburgh’s Carnegie-Mellon University, fears that universally available microcomputers could turn into formidable weapons. Among other things, says Reddy, sophisticated computers in the wrong hands could begin subverting a society by tampering with people’s relationships with their own computers, instructing the other computers to cut off telephone, bank and other services, for example. The danger lies in the fast-expanding computer data banks, with their concentration of information about people and governments, and in the possibility of access to those repositories. Already, computer theft is a growth industry, so much so that the FBI has a special program to train agents to cope with the electronic cutpurses.
Dartmouth College President John G. Kemeny, an eminent mathematician, envisions great benefits from the computer, but in his worst-case imaginings he sees a government that would possess one immense, interconnecting computer system: Big Brother. The alternative is obviously to isolate government computers from one another, to decentralize them, to prevent them from possibly becoming dictatorial. But that would require considerable foresight, sophistication, and possibly a tough new variety of civil rights legislation.
Some of the most informed apprehensions about computers are expressed by Professor Joseph Weizenbaum of M.I.T.’s Laboratory for Computer Science. Human dependence on computers, Weizenbaum argues, has already become irreversible, and in that dependence resides a frightening vulnerability. It is not just that the systems might break down; the remedy for that could eventually be provided by a number of back-up systems. Besides, industrialized man is already vulnerable to serious dislocations by breakdowns: when the electrical power of New York City goes out, for example. Perhaps a greater danger, says Weizenbaum, lies in the fact that “a computer will do what you tell it to do, but that may be much different from what you had in mind.” The machines can break loose from human intentions. Computers, he argues, are infinitely literal-minded; they exercise no judgments, have no values. Fed a program that was mistaken, a military computer might send off missiles in the wrong direction or fire them at the wrong time. Several years ago, Admiral Thomas Moorer, then Chairman of the Joint Chiefs of Staff, told a Senate committee: “It is unfortunate that we have become slaves to these damned computers.”
Some social critics are worried that a democratization of computers, making them as common as television sets are today, may eventually cause human intellectual powers to atrophy. Even now, students equipped with pocket calculators have been relieved of having to do their figuring on paper; will they eventually forget how to do it, just as urban man has lost so many crafts of survival? Possibly. But the steam engine did not destroy men’s muscles, and the typewriter has not ruined the ability to write longhand.
Certain pre-computer skills should be taught so that they do not vanish. But as Leibniz observed in 1671: “It is unworthy of excellent men to lose hours like slaves in the labor of calculation which could safely be relegated to anyone else if machines were used.” Einstein had to have help with his calculations; they are drone’s work anyway. Says Author Martin Gardner (Mathematical Carnival): “There is no reason why a person should have to sit down and compute the square root of seven. The computer is freeing the individual for more interesting tasks.”
The rapid proliferation of microcomputers will doubtless cause many social dislocations. But the hope is that the burgeoning technology will create an almost limitless range of new products and services and therefore a great new job market. Though one expert estimates that it would take the entire U.S. female population between ages 18 and 45 to run the nation’s telephone system if it were not computerized, Ma Bell now employs more people than it did when its first automatic switching service was introduced.
All of the prodigies of technology leave many people not only nostalgic for simpler times but alarmed by the unknown dangers that “progress” may bring with it. Those who first used fire must have terrified their generation. Practically any breakthrough in knowledge carries with it the possibility that it will be used for evil. But with microcomputers, the optimists can argue an extremely persuasive case. The Industrial Revolution had the effect of standardizing and routinizing life. Microtechnology, with its nearly infinite capacities and adaptability, tends on the contrary toward individualization; with computers, people can design their lives far more in line with their own wishes. They can work at terminals at home instead of in offices, educate themselves in a variety of subjects at precisely the speed they wish, shop electronically with the widest possible discretion. Among other things, microtechnology will make the mechanism of supply and demand operate more responsively; customers’ wishes will be registered at the speed of light.
Some, like Sociologist Seymour Martin Lipset, envision a “more egalitarian society” because of the computer. Transferring so much work to the machines, thinks Lipset, may produce something like Athenian democracy; Athenians could be equal because they had slaves to do their work for them.
Says Isaac Asimov, the prolific author and futuristic polymath: “We are reaching the stage where the problems that we must solve are going to become insoluble without computers. I do not fear computers. I fear the lack of them.” Many people have great expectations and doubts about the new technology, especially in a century when they have felt themselves enslaved and terrorized by the works of science. Stewart Brand, creator of the Whole Earth Catalog, argues for a longer perspective: “This is a story that goes back to the beginning of tool-using animals, back to the rocks the earliest man picked up in Africa. As soon as he started picking up rocks, his hands started changing, his brain started changing. Computers are simply a quantum jump in the same co-evolutionary process.”
There seems little doubt that life in the U.S., then in the rest of the industrial world and eventually all over the planet, will be incalculably changed by the new microtechnology. In the following pages of this special section, Time explores the changes that are likely to come and those that have already occurred. We also explain the recondite world of microcomputers, how they work, how and where they are made, and look far ahead to a future when the distinction between man and the wondrous device he has created may begin to blur.