A Short History of Computing (Jul, 1978)


A Short History of Computing

A few weeks ago a master’s degree candidate in computer science confided, with an embarrassed laugh, that he had never seen a computer. His experience with the machines of his chosen vocation had consisted entirely of submitting punched cards through a hole in a wall and later getting printed results the same way. While his opportunities to see equipment are restricted due to his student status, there are also thousands of working programmers and analysts using large scale equipment who have no contact with existing hardware and will never have a chance to see any first or second generation computers in operation.

This is in sharp contrast with the way programmers worked in the late 1950s and early 1960s. Before 1964, when multiprogramming computers were introduced, the typical programmer had opportunities to come in contact with the computer if he or she wanted to do so. Prior to 1960, in fact, most programmers actually operated the machine when debugging their programs. These people learned of the computer as a physical device; the current programmer is more likely to think of it as a vague logical entity at the other end of a terminal. Thus, many large system programmers have the rare distinction of using a tool without knowing how it works or what it looks like. This is in spite of the fact that many important computer developments have occurred within the average programmer’s lifetime.

However, in the past year or two, dramatic reductions in the cost of minicomputer components and the advent of the microcomputer have returned the hands-on computer to respectability in two ways. First, it is now possible to justify hands-on debugging on a small computer, since the hourly rate of the programmer is higher than that of the machine. Second, the decreasing cost of home computing has fostered the birth of a new class of “renaissance programmers”: people who combine programming expertise with hardware knowledge and aren’t afraid to admit it. Renaissance programmers can learn much from the lessons of computer history; simple and inelegant hardware isn’t necessarily best, but it’s frequently cheapest.

In short, the stored program computer became a necessary tool only recently, even though various mechanical aids to computation have been in existence for centuries.

One of the first such aids was the abacus, the invention of which is claimed by the Chinese. It was known in Egypt as early as 460 BC. The Chinese version of the abacus (as shown in photo 1) consists of a frame strung with wires containing seven beads each. Part of the frame separates the topmost two beads from the lower five. The right-hand wire represents units, the next tens, the next hundreds, and so on. The operator slides the beads to perform addition and subtraction and reads the resulting sum from the final position of the beads. The principle of the abacus became known to Roman and early European traders, who adopted it in a form in which stones (calculi in Latin, hence the word “calculate”) are moved around in grooves on a flat board.
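
The encoding is easy to state precisely: each bead above the divider counts five, each bead below counts one, so a wire holds a single decimal digit and the wires together hold a base-10 number. Here is a minimal sketch in Python (our own illustration; the function names are invented for this example, not anything from the original article):

```python
def column_value(upper_beads: int, lower_beads: int) -> int:
    """Value of one abacus wire: each upper bead counts 5, each lower bead 1."""
    assert 0 <= upper_beads <= 2 and 0 <= lower_beads <= 5
    return 5 * upper_beads + lower_beads

def abacus_value(columns) -> int:
    """Columns listed left to right; the rightmost wire is the units wire."""
    total = 0
    for upper, lower in columns:
        total = total * 10 + column_value(upper, lower)
    return total

# Tens wire: 1 upper + 2 lower beads (7); units wire: 4 lower beads (4).
print(abacus_value([(1, 2), (0, 4)]))  # prints 74
```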

The use of precision instruments dates back to the Alexandrian astronomers. Like the mathematics of the period, however, the development of scientific instruments died away with the demise of the Alexandrian school. The Arabs renewed interest in astronomy in the period between 800 and 1500 AD, and it was during this time that the first specialists in instrument making appeared. The center of instrument making shifted to Nuremberg, beginning about 1400. By the middle of the 16th Century, precise engraving on brass was well advanced due in part to the interest in book printing.

Calendrical calculators used for determining the moon’s phases and the positions of the planets crop up in all the major periods of scientific thought in the past two thousand years. Parts of a Greek machine about 1800 years old, apparently used to simulate the motions of the planets, were found in 1902 in the remains of a ship off the island of Antikythera. The gears of the machine indicate amazing technical ability and knowledge. Later calendrical calculators, which were usually of the type in which two or more flat disks were rotated about the same axis, came to include a means of telling time at night by visually aligning part of the Big Dipper with the pole star.

Trigonometric calculators, working on a graphical principle, were in use in the Arabic period. Such calculators were used mainly to determine triangular relationships in surveying. The popularity of this device was renewed in 14th Century Europe; in fact, calculating aids of all kinds grew rapidly in popularity as well as in scope from this time onward, largely due to the difficulty of the arithmetic techniques of the time. Napier was continually seeking ways to improve computational methods through his inventions. One such invention, “Napier’s bones,” consisted of a number of flat sticks similar to the kind now used in ice cream bars. Each stick was marked off into squares containing numbers. To perform calculations, the user manipulated the sticks up and down in a manner reminiscent of the abacus. Of particular interest is the fact that Napier’s invention was used for general calculation at a time when many other devices were used for the specific determination of one measurement, such as the volume of liquid in a partly full barrel, or the range of an artillery shot.

Pascal invented and built what is often called the first real calculating machine in 1642 (shown in photo 2). The machine consisted of a set of geared wheels arranged so that a complete revolution of any wheel rotated the wheel to its left one tenth of a revolution. Digits were inscribed on the side of each wheel. Additions and subtractions could be performed by the rotation of the wheels; this was done with the aid of a stylus. Pascal’s calculator design is still widely seen in the form of inexpensive plastic versions found in variety stores.
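
Pascal’s carry linkage is the mechanical ancestor of the ripple carry in a digital adder: a wheel passing 9 rolls over to 0 and advances its left neighbor one step, and that neighbor may roll over in turn. A short Python sketch of the behavior (an illustrative model of the principle, not a description of the actual gearing):

```python
def add_to_wheels(wheels, addend_digits):
    """Simulate Pascaline-style addition on digit wheels, units wheel first.

    Advancing a wheel past 9 rolls it back to 0 and advances the wheel
    to its left by one step, just as a full revolution of one geared
    wheel carried a tenth of a revolution leftward.  Carrying off the
    leftmost wheel would raise an IndexError here; the real machine
    simply had a fixed number of wheels.
    """
    for pos, digit in enumerate(addend_digits):
        wheels[pos] += digit
        i = pos
        while wheels[i] > 9:      # ripple the carry leftward
            wheels[i] -= 10
            i += 1
            wheels[i] += 1
    return wheels

# Six wheels reading 000278, add 195: expect 000473.
print(add_to_wheels([8, 7, 2, 0, 0, 0], [5, 9, 1]))  # [3, 7, 4, 0, 0, 0]
```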

In 1671 Leibniz invented a machine capable of multiplication and division, but it is said to have been prone to inaccuracies.

The work of Pascal, Leibniz, and other pioneers of mechanical calculation was greatly facilitated by the knowledge of gears and escapements gained through advances in the clock. In the 13th Century, a clock was devised for Alfonso X of Spain which used a falling weight to turn a dial. The weight was regulated by a cylindrical container divided into partitions and partly filled with mercury. The mercury flowed slowly through small holes in the partitions as the cylinder rotated; this tended to counterbalance the weight. By the 15th Century, the recoil of a spring regulated by an escapement had made its appearance as a source of motive power. Gear trains of increasing complexity and ingenuity were invented. Clocks could now strike on the hours, have minute and second hands (at first on separate dials), and record calendrical and astronomical events. Gears opened the door to wonderful automata and gadgets such as the Strasbourg clock of 1354. This device included a mechanical rooster which flapped its wings, stretched its metal feathers, opened its beak and crowed every day at noon. Later improvements in timekeeping included Galileo’s studies of the pendulum and the accurate driving of a clock without weights or pendulum, which led to the portable watch.

Although mechanical and machine shop techniques still had a long way to go (consider the 18th Century machinist’s inability to fit a piston tightly into a cylinder), the importance of mechanical inventions as aids to computation was overshadowed by electrical discoveries beginning with the invention of the battery by Volta in 1800.

During the 1700s, much experimental work had been done with static electricity. The so-called electrical machine underwent a number of improvements. Other electrical inventions like the Leyden jar appeared, but all were based on static electricity, which releases very little energy in a very spectacular way. In 1820, following Volta’s discovery, Oersted recognized the principle of electromagnetism that allowed Faraday to complete the work leading to the dynamo, and eventually to the electric motor. It was not until 1873, however, that Gramme demonstrated a commercially practicable direct current motor in Vienna. Alternating current (AC) was shown to be the most feasible type of electric power for distribution, and subsequently the AC motor was invented in 1888 by Tesla. The value of electric power for transportation was quickly recognized and employed in tramways and electric railways. This led to improvements in methods for controlling electricity. Electric lighting methods sprang up like weeds during the latter half of the 19th Century. The most successful were due to the efforts of Swan in England and Edison in the United States. Work on electric lighting, the telegraph and the telephone led to the wonder of the age: radio. In 1895, Marconi transmitted a radio message over a distance of one mile, and six years later from England to Newfoundland.

As a consequence of the rapid growth of interest in the radio, much work was done on the vacuum tube. Lee de Forest discovered the principle of the triode in 1907. Until the development of the transistor, the vacuum tube was the most important device in computer technology due to its ability to respond to changes in electrical voltage in extremely short periods of time. The cathode ray tube, invented by William Crookes, was used in computers for a few years prior to 1960. It faded temporarily from view but returned in 1964 due to advances in technology that improved its economic feasibility as well as its value as a display tool. In 1948 Bardeen, Brattain and Shockley developed the transistor, which began to replace the vacuum tube in computers in 1959. The transistor has many advantages over the vacuum tube as a computer component: it lasts much longer, generates much less heat, and takes up less space. It therefore replaced the vacuum tube, only to fall prey in turn to microminiaturization. Of course, the transistor principle didn’t go away, but the little flying saucers with three wires coming out of their bases did.

Oddly enough, one of the most fundamental devices in the early history of computing predates the electronic computer by more than two hundred years. The punched card was first used to control patterns woven by the automatic loom. Although Jacquard is commonly thought to have originated the use of cards, it was actually done first by Falcon in 1728. Falcon’s cards, which were connected together like a roll of postage stamps, were used by Jacquard to control the first fully automatic loom in France, and later appeared in Great Britain about 1810 (see photo 3). At about the same time, Charles Babbage began to devote his thinking to the development of computing machinery. Babbage’s first machine, the Difference Engine, shown in photo 4, was completed in 1822 and was used in the computation of tables. His attempts to build a larger Difference Engine were unsuccessful, even though he spent £23,000 on the project (£6,000 of his own, and £17,000 of the government’s).

In 1833 Babbage began a project that was to be his life’s work and his supreme frustration: the Analytical Engine. This machine was remarkably similar in principle to modern computers, but it was never completed. During the forty years devoted to the project, many excellent engineering drawings were made of parts of the Analytical Engine, and some parts of the machine were actually completed at the expense of Babbage’s considerable personal fortune. The machine, which was to derive its motive power from a steam engine, was to use punched cards to direct its activities. The Engine was to include the capability of retaining and displaying upon demand any of its 1000 fifty-digit numbers (the first suggestion that a computing machine should have a memory) and was to be capable of changing its course of action depending on calculated results. Unfortunately for Babbage, his theories were years ahead of existing engineering technology, but he contributed to posterity the idea that punched cards could be used as inputs to computers.

Herman Hollerith put punched cards to use in 1890 in his electric accounting machines, which were not computers, but machines designed to sort and collate cards according to the positions of holes punched in the cards (see photo 5). Hollerith’s machines were put to effective use in the United States census of 1890.
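
Distributing a deck into pockets one column at a time, then restacking and repeating on the next column, is the procedure programmers now call a least-significant-digit radix sort. A brief Python sketch of the idea (our own illustration of the principle; Hollerith’s machines distributed physical cards into pockets, one pass per column):

```python
def card_sort(cards, columns):
    """Sort numeric card fields the way a column-at-a-time sorter would.

    cards:   list of equal-length digit strings, e.g. "0423".
    columns: column indices to process, least significant column first.
    Each pass distributes cards into ten pockets and restacks them in
    pocket order; because each pass is stable, the deck ends up sorted.
    """
    for col in columns:
        pockets = [[] for _ in range(10)]      # one pocket per punch position
        for card in cards:
            pockets[int(card[col])].append(card)
        cards = [card for pocket in pockets for card in pocket]  # restack
    return cards

deck = ["0423", "1201", "0099", "0420"]
print(card_sort(deck, columns=[3, 2, 1, 0]))  # ['0099', '0420', '0423', '1201']
```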

In 1911, the Computing-Tabulating-Recording Company was formed, which changed its name to International Business Machines in 1924. In the period between 1932 and 1945 many advances were made in electric accounting machines, culminating in 1946 with IBM’s announcement of the IBM 602 and 603 electronic calculators, which were capable of performing arithmetic on data punched onto a card and of punching the result onto the same card. It was Remington Rand, however, who announced the first commercially available electronic data processing machine, the Univac I, the first of which was delivered to the US Census Bureau in 1950. In 1963, just thirteen years after the beginning of the computer business, computer rental costs in the United States exceeded a billion dollars.

Univac I was not the first computer, even though it was the first to be offered for sale. Several one-of-a-kind computers were built in the period between 1944 and 1950, partly as a result of the war. In 1939 work was begun by IBM on the Automatic Sequence Controlled Calculator, Mark I, which was completed in 1944 and used at Harvard University (see photo 6). Relays were used to retain numbers; since relays are electromechanical and have parts that actually move, they are very slow by modern standards.

In 1943, Eckert, Mauchly and Goldstine started to build the ENIAC (Electronic Numerical Integrator and Calculator), which became the first electronic computer using vacuum tubes instead of relays (see photo 7). The next year John von Neumann became interested in ENIAC, and by 1946 he had recognized a fundamental flaw in its design. In “Preliminary Discussion of the Logical Design of an Electronic Computing Instrument,” von Neumann pointed out the advantages of using the computer’s memory to store not only data but the program itself. Machines without stored program capabilities were limited in scope, since they had to be partly rewired in order to solve a new problem (as was the case with ENIAC). This process sometimes took days, during which time the machine could not be used. If rewiring of such machines was to be avoided, instructions had to be entered and executed one at a time, which greatly limited the machine’s decision making capabilities. Machines with stored program capabilities automatically store not only numeric data but also the program (which looks like numbers and can be treated like numbers) in memory. In short, stored program instructions can be used to modify other instructions, a concept that leads to programs which can modify themselves. It is the von Neumann stored program concept which is universally used in modern computers, from the smallest microcomputer to the largest number crunchers.
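
The idea is easy to demonstrate with a toy interpreter. In the Python sketch below (the three-cell instruction format is our own invention for illustration, not any historical machine’s), program and data occupy the same memory, so one instruction can overwrite another; here the program rewrites its own first opcode into a halt:

```python
def run(mem):
    """Execute a toy stored-program machine.

    mem is a flat list of integers; an instruction is three consecutive
    cells [op, a, b].  Opcodes: 0 = HALT, 1 = ADD (mem[a] += mem[b]),
    2 = SET (mem[a] = the literal b), 3 = JMP (continue at address a).
    Because code and data share one memory, SET can rewrite code.
    """
    pc = 0
    while mem[pc] != 0:                  # opcode 0 means halt
        op, a, b = mem[pc], mem[pc + 1], mem[pc + 2]
        if op == 1:                      # ADD
            mem[a] += mem[b]
        elif op == 2:                    # SET
            mem[a] = b
        elif op == 3:                    # JMP
            pc = a
            continue
        pc += 3
    return mem

program = [
    1, 10, 9,   # addr 0: ADD  mem[10] += mem[9]
    2, 0, 0,    # addr 3: SET  mem[0] = 0 -- rewrites the ADD into a HALT
    3, 0, 0,    # addr 6: JMP  back to addr 0, which is now a HALT
    100,        # addr 9:  data
    1,          # addr 10: accumulator
]
run(program)
print(program[10], program[0])   # prints "101 0": the code modified itself
```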

The growth of the missile industry in the 1950s greatly stimulated the progress of computers used for scientific work. The nature of missile data handling at that time was such that work loads were very high during the week or so after a firing and virtually nonexistent in between. Computers were too expensive to leave idle, which led managers to look for other work for the machines. Business data processing grew from these roots to its present status, accounting for the lion’s share of machine usage today.

The latter part of 1959 saw the arrival of the transistorized computer. As a consequence of this innovation, air conditioning and power requirements for computers were reduced. Several new computers were announced that year by IBM, Control Data Corporation, General Electric, and other manufacturers. Among the IBM announcements were the 7070, a general purpose computer; the 7090, a high speed computer designed for a predominance of scientific work; the 1401, a relatively inexpensive computer aimed at the medium-sized business; and the 1620, a low priced, scientifically oriented computer. The fantastic growth of the computer field continued through 1961 and 1962 with the announcement of more than 20 new machines each year. In 1963, continuing the family line from the grandfather 704 (as shown in photo 8), the IBM 7040 was announced. This machine embodied many of the features of the 7090 at a reduced cost. In the same year at least 23 other computers were announced by several different manufacturers. In 1964, IBM announced the 7010, an enlarged and faster version of the 1410, and the 360, which came in many different sizes and embodied many features not found in previous computers. Control Data Corporation announced the 6600, and General Electric their 400 series. The IBM 360/370 is typical of a trend in computer manufacturing now followed by most manufacturers: upward compatibility. In the years prior to 1965, every manufacturer spent huge sums of money on research and programming support for several types of computers; several went out of business doing so. Likewise, computer users spent a lot of money to develop their systems for a particular computer, only to find it had been superseded by a faster, less expensive machine. As a consequence, the deadly management decision of the period was, “Do we get the cheaper machine and spend the money on reprogramming, or do we risk staying with an obsolete computer and losing our programmers to the company across the street?”

Current developments point to a new trend away from the bigger machines. The combination of lower prices for components and programmable read only memories is attracting many manufacturers to the field of minicomputers and microcomputers. The current trend is clearly toward the personal computer, with TV game microprocessors leading the way.

8 comments
  1. Andrew L. Ayers says: August 5, 2011 9:33 am

    Photo 7: Is that really the ENIAC? If it is, it’s a view that I have -never- seen published anywhere else (and I have read and own an absolute ton of books on computer history). Take a look at the pictures on Wikipedia, for example:

    http://en.wikipedia.org…

    Do you see any of the clean lines of Photo 7 in the article in any of those pictures? Take a look around the museum:

    http://www.seas.upenn.e…

    Where are the cables? Where are the rolling plugboards? I did a Google Image Search for “ENIAC”, and got a ton of pictures, all looking similar (cables, plugboards, rat nests); of those pictures (on the first “page”), only two looked anything remotely like Photo 7; one was another copy of Photo 7:

    http://www.nordhornanti…

    The other was this one:

    http://alfredo.octavio….

    It seems strange to me that these two would stand out – I know that ENIAC was moved around a few times, and upgraded over the years; if anything, these two images would be from its last years before it was decommissioned. I just find it strange that there would be so few images of it in this configuration (with tape drives and a console)…

    Hopefully someone else here can shed some light on this…

  2. Andrew L. Ayers says: August 5, 2011 9:42 am

    Actually – do a Google Image Search on “Univac”:

    http://en.wikipedia.org…

    Another Eckert and Mauchly machine (essentially the first commercialized computer for businesses and a follow-on to ENIAC). Notice how it (as well as the family of machines) looks really similar to Photo 7?

    In fact, I am almost certain that this earlier image link I posted:

    http://alfredo.octavio….

    …is actually a UNIVAC – you can see similar tape drives in the UNIVAC GIS results.

    Was this a photo mixup by the Byte editors…?

  3. Charlene says: August 6, 2011 10:45 am

    Under normal practices the writer of an article like this one wouldn’t even see the images the magazine was intending to run with his work until after publication, so Reid-Green shouldn’t be held responsible. Unfortunately misidentified images are common in image banks even now, but thirty years ago the problem was far worse; some doozers, like the image I found in the CP archives of a horse labelled “US President Richard M. Nixon”, were obvious, but something like this? It’s unlikely they would even question it.

  4. Andrew L. Ayers says: August 6, 2011 12:06 pm

    @Charlene: I was actually thinking it might be the other way around; that perhaps the author supplied the wrong images, and that the Byte editorial staff wasn’t responsible. I tend to hold Byte of this era to a higher standard, though I am willing to entertain the idea that there simply was a screwup in the selection of images due to mis-labeling or other means. We’ll likely never know; anything is speculation at this point (and pointless speculation at that, probably).

    At the same time, I am only speculating that the image is that of a Univac – but not likely a Univac I, as the console of the Univac I (shown on the last page) doesn’t match that in Photo 7; I suspect it’s of a later model Univac – or possibly, even probably, of some other manufacturer’s machine (which – I don’t know right now).

  5. Andrew L. Ayers says: August 6, 2011 1:18 pm

    You know – the more I look at Photo 7, the more I wonder just -what- computer it is; it seems to be a very singular image of a computer.

    I am fairly certain it is not the ENIAC, as the ENIAC was a plug-board programmed computer, and AFAIK, didn’t have a console like the lady is sitting at. Also, if you note in the background on the left of the photo, there appears to be a wide-format paper tape reader of some sort.

    For the life of me, I am unable to find any similar photos online or in my various computer history book collection (which consists of both “current” books looking back – aka history – and contemporary books of the 1950s and 1960s regarding computer technology) of that machine, the console, or the tape reader. It’s an utter mystery just what machine it is.

    I sincerely hope someone comes along here and puts my mind at rest (alright, this isn’t going to keep me up at night, but it is interesting to me).

    @Charlene, or anyone else: Does the clothing and/or hairstyle of the woman seem to indicate to you the 1950s? What about the country (ie – is this an American hairstyle/mode of dress – or European, or British)? Does it indicate some other era?

  6. JMyint says: August 6, 2011 1:28 pm

    Andrew, photo number 7 isn’t the ENIAC, nor is it a Univac or any other Eckert-Mauchly computer (for a little bit I thought it might be the assembled BINAC). The control station is wrong for a Univac. It turns out that it is the IBM SSEC of 1948.

    http://www.computerhist…

    A close up of the control station.

    www-03.ibm.com/ibm/hist…

  7. JMyint says: August 6, 2011 1:44 pm

    Here are more pictures of the SSEC in operation. It was built into the lobby of the IBM headquarters in New York City so that passers-by and visitors could see it working.

    http://www.columbia.edu…

  8. Toronto says: August 6, 2011 8:26 pm

    JM: I’m tearing up – that’s just so beautiful.

    Four-hundred-pound rolls of seven-inch-wide punched manila paper? I’d hate to have been the operator!
