
2 The beginning of the information age

The first time I heard the term “Information Age” I was tantalized. I knew about the Iron Age and the Bronze Age, periods of history named for the new materials men used to make their tools and weapons. Those were specific eras. Then I read academics predicting that countries would fight over the control of information, not natural resources. This sounded intriguing too, but what did they mean by information?

The claim that information would define the future reminded me of the famous party scene in the 1967 movie The Graduate. A businessman buttonholes Benjamin, the college graduate played by Dustin Hoffman, and offers him a single word of unsolicited career advice: “Plastics.” I wondered whether, if the scene had been written a few decades later, the businessman’s advice would have been: “One word, Benjamin. ‘Information.’”

I imagined nonsensical conversations around a future office watercooler: “How much information do you have?” “Switzerland is a great country because of all the information they have there!” “I hear the Information Price Index is going up!”

It sounds nonsensical because information isn’t as tangible or measurable as the materials that defined previous ages, but information has become increasingly important to us. The information revolution is just beginning. The cost of communications will drop as precipitously as the cost of computing already has. When it gets low enough and is combined with other advances in technology, “information highway” will no longer be just a phrase for eager executives and excited politicians. It will be as real and as far‑reaching as “electricity.” To understand why information is going to be so central, it’s important to know how technology is changing the ways we handle information.

The majority of this chapter is devoted to such an explanation. The material that follows is meant to give readers without a background in computer principles and history enough grounding to enjoy the rest of the book. If you understand how digital computers work, you probably already know the material cold, so feel free to skip to chapter 3.

The most fundamental difference we’ll see in future information is that almost all of it will be digital. Whole printed libraries are already being scanned and stored as electronic data on disks and CD‑ROMs. Newspapers and magazines are now often completely composed in electronic form and printed on paper as a convenience for distribution. The electronic information is stored permanently–or for as long as anyone wants it–in computer databases: giant banks of journalistic data accessible through on‑line services. Photographs, films, and videos are all being converted into digital information. Every year, better methods are being devised to quantify information and distill it into quadrillions of atomistic packets of data. Once digital information is stored, anyone with access and a personal computer can instantaneously recall, compare, and refashion it. What characterizes this period in history is the completely new ways in which information can be changed and manipulated, and the increasing speeds at which we can handle it. The computer’s abilities to provide low‑cost, high‑speed processing and transmission of digital data will transform the conventional communication devices in homes and offices.

The idea of using an instrument to manipulate numbers isn’t new. The abacus had been in use in Asia for nearly 5,000 years by 1642, when the nineteen‑year‑old French scientist Blaise Pascal invented a mechanical calculator. It was a counting device. Three decades later, the German mathematician Gottfried von Leibniz improved on Pascal’s design. His “Stepped Reckoner” could multiply, divide, and calculate square roots. Reliable mechanical calculators, powered by rotating dials and gears, descendants of the Stepped Reckoner, were the mainstay of business until their electronic counterparts replaced them. When I was a boy, a cash register was essentially a mechanical calculator linked to a cash drawer.

More than a century and a half ago, a visionary British mathematician glimpsed the possibility of the computer and that glimpse made him famous even in his day. Charles Babbage was a professor of mathematics at Cambridge University who conceived the possibility of a mechanical device that would be able to perform a string of related calculations. As early as the 1830s, he was drawn to the idea that information could be manipulated by a machine if the information could be converted into numbers first. The steam‑powered machine Babbage envisioned would use pegs, toothed wheels, cylinders, and other mechanical parts, the apparatus of the then‑new Industrial Age. Babbage believed his “Analytical Engine” would be used to take the drudgery and inaccuracy out of calculating.

He lacked the terms we now use to refer to the parts of his machine. He called the central processor, or working guts of his machine, the “mill.” He referred to his machine’s memory as the “store.” Babbage imagined information being transformed the way cotton was–drawn from a store (warehouse) and milled into something new.

His Analytical Engine would be mechanical, but he foresaw how it would be able to follow changing sets of instructions and thus serve different functions. This is the essence of software. It is a comprehensive set of rules a machine can be given to “instruct” it how to perform particular tasks. Babbage realized that to create these instructions he would need an entirely new kind of language, and he devised one using numbers, letters, arrows, and other symbols. The language was designed to let Babbage “program” the Analytical Engine with a long series of conditional instructions, which would allow the machine to modify its actions in response to changing situations. He was the first to see that a single machine could serve a number of different purposes.

For the next century mathematicians worked with the ideas Babbage had outlined and finally, by the mid‑1940s, an electronic computer was built based on the principles of his Analytical Engine. It is hard to sort out the paternity of the modern computer, because much of the thinking and work was done in the United States and Britain during World War II under the cloak of wartime secrecy. Three major contributors were Alan Turing, Claude Shannon, and John von Neumann.

In the mid‑1930s, Alan Turing, like Babbage a superlative Cambridge‑trained British mathematician, proposed what is known today as a Turing machine. It was his version of a completely general‑purpose calculating machine that could be instructed to work with almost any kind of information.

In the late 1930s, when Claude Shannon was still a student, he demonstrated that a machine executing logical instructions could manipulate information. His insight, the subject of his master’s thesis, was about how computer circuits–closed for true and open for false–could perform logical operations, using the number 1 to represent “true” and 0 to represent “false.”
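
As a minimal illustration (a sketch in modern Python, not anything Shannon wrote), his insight amounts to building logical operations entirely out of 1s and 0s, with switches in series behaving like AND and switches in parallel behaving like OR:

def AND(a, b):
    # Two switches in series: current flows only if both are closed (1).
    return a & b

def OR(a, b):
    # Two switches in parallel: current flows if either switch is closed.
    return a | b

def NOT(a):
    # An inverter: a closed switch reads as open, and vice versa.
    return 1 - a

# The whole truth table can be checked mechanically:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", AND(a, b), "OR:", OR(a, b), "NOT a:", NOT(a))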

This is a binary system. It’s a code. Binary is the alphabet of electronic computers, the basis of the language into which all information is translated, stored, and used within a computer. It’s simple, but so vital to the understanding of the way computers work that it’s worth pausing here to explain it more fully.

Imagine you have a room that you want illuminated with as much as 250 watts of electric lighting and you want the lighting to be adjustable, from 0 watts of illumination (total darkness) to the full wattage. One way to accomplish this is with a rotating dimmer switch hooked to a 250‑watt bulb. To achieve complete darkness, turn the knob fully counterclockwise to Off for 0 watts of light. For maximum brightness, turn the knob fully clockwise for the entire 250 watts. For some illumination level in between, turn the knob to an intermediate position.

This system is easy to use but has limitations. If the knob is at an intermediate setting–if lighting is lowered for an intimate dinner, for example–you can only guess what the lighting level is. You don’t really know how many watts are in use, or how to describe the setting precisely. Your information is approximate, which makes it hard to store or reproduce.

What if you want to reproduce exactly the same level of lighting next week? You could make a mark on the switch plate so that you know how far to turn it, but this is hardly exact, and what happens when you want to reproduce a different setting? What if a friend wants to reproduce the same level of lighting? You can say, “Turn the knob about a fifth of the way clockwise,” or “Turn the knob until the arrow is at about two o’clock” but your friend’s reproduction will only approximate your setting. What if your friend then passes the information on to another friend, who in turn passes it on again? Each time the information is handed on, the chances of its remaining accurate decrease.

That is an example of information stored in “analog” form. The dimmer’s knob provides an analogy to the bulb’s lighting level. If it’s turned halfway, presumably you have about half the total wattage. When you measure or describe how far the knob is turned, you’re actually storing information about the analogy (the knob) rather than about the lighting level. Analog information can be gathered, stored, and reproduced, but it tends to be imprecise–and runs the risk of becoming less precise each time it is transferred.
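
The drift is easy to picture with a toy simulation in Python (a minimal sketch, assuming each retelling of the knob position is off by up to five watts):

import random

random.seed(0)                 # fixed seed so the toy run is repeatable
setting = 137.0                # the original knob position, in watts

# Pass the description from friend to friend; each retelling is a little off.
for hand_off in range(10):
    setting += random.uniform(-5.0, 5.0)

print("what the tenth friend dials in:", round(setting, 1), "watts")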

Now let’s look at an entirely different way of describing how to light the room, a digital rather than analog method of storing and transmitting information. Any kind of information can be converted into numbers using only 0s and 1s. These are called binary numbers–numbers composed entirely of 0s and 1s. Each 0 or 1 is called a bit. Once the information has been converted, it can be fed to and stored in computers as long strings of bits. Those numbers are all that’s meant by “digital information.”
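
Here is a minimal sketch of that conversion in Python, using the word “light” as the piece of information and one eight-bit byte per character:

message = "light"

# Each character becomes one byte: eight bits, written out as 0s and 1s.
bits = "".join(format(byte, "08b") for byte in message.encode("ascii"))
print(bits)      # 0110110001101001011001110110100001110100

# The same string of bits can be turned back into the original information.
restored = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)).decode("ascii")
print(restored)  # light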

Instead of a single 250‑watt bulb, let’s say you have eight bulbs, each with a wattage double the one preceding it, from 1 to 128. Each of these bulbs is hooked to its own switch, with the lowest‑watt bulb on the right. Such an arrangement can be diagrammed like this:
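
128   64   32   16   8   4   2   1
(the wattage of each of the eight bulbs, every one controlled by its own switch)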

By turning these switches on and off, you can adjust the lighting level in 1‑watt increments from 0 watts (all switches off) to 255 watts (all switches on). This gives you 256 possibilities. If you want 1 watt of light, you turn on only the rightmost switch, which turns on the 1‑watt bulb. If you want 2 watts of light, you turn on only the 2‑watt bulb. If you want 3 watts of light, you turn on both the 1‑watt and 2‑watt bulbs, because 1 plus 2 equals the desired 3 watts. If you want 4 watts of light, you turn on the 4‑watt bulb. If you want 5 watts, you turn on just the 4‑watt and 1‑watt bulbs. If you want 250 watts of light, you turn on all but the 4‑watt and 1‑watt bulbs.
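
The rule for choosing bulbs is mechanical enough to capture in a short Python sketch (an illustration of the scheme just described, with the highest-wattage bulb listed first):

BULBS = [128, 64, 32, 16, 8, 4, 2, 1]   # the eight bulbs, highest wattage on the left

def switches_for(watts):
    """Return the on/off pattern that produces exactly `watts` of light (0 to 255)."""
    pattern = []
    remaining = watts
    for bulb in BULBS:
        if remaining >= bulb:      # this bulb is needed, so switch it on
            pattern.append("on")
            remaining -= bulb
        else:
            pattern.append("off")
    return pattern

print(switches_for(5))     # ['off', 'off', 'off', 'off', 'off', 'on', 'off', 'on']
print(switches_for(250))   # every bulb on except the 4-watt and the 1-watt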

If you have decided the ideal illumination level for dining is 137 watts of light, you turn on the 128‑, 8‑, and 1‑watt bulbs, like this:
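
128   64   32   16   8    4    2    1
on    off  off  off  on   off  off  on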

This system makes it easy to record an exact lighting level for later use or to communicate it to others who have the same light‑switch setup. Because the way we record binary information is universal–low number to the right, high number to the left, always doubling–you don’t have to write down the values of the bulbs. You simply record the pattern of switches: on, off, off, off, on, off, off, on. With that information a friend can faithfully reproduce the 137 watts of light in your room. In fact, as long as everyone involved double‑checks the accuracy of what he does, the message can be passed through a million hands and at the end every person will have the same information and be able to achieve exactly 137 watts of light.

To shorten the notation further, you can record each “off” as 0 and each “on” as 1. This means that instead of writing down “on, off, off, off, on, off, off, on,” meaning turn on the 128-watt, the 8-watt, and the 1-watt bulbs and leave the others off, you write the same information as 1, 0, 0, 0, 1, 0, 0, 1, or 10001001, a binary number. In this case it’s 137. You call your friend and say: “I’ve got the perfect lighting level! It’s 10001001. Try it.” Your friend gets it exactly right, by flipping a switch on for each 1 and off for each 0.
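
That call works because reading the switch pattern as a binary number, or writing the wattage back out as binary digits, is a purely mechanical step; in Python, for instance:

pattern = "10001001"

# Reading the pattern as a binary number gives the wattage it encodes.
print(int(pattern, 2))      # prints 137

# Writing 137 back out as eight binary digits recovers the pattern.
print(format(137, "08b"))   # prints 10001001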

This may seem like a complicated way to describe the brightness of a light source, but it is an example of the theory behind binary expression, the basis of all modern computers.

Binary expression made it possible to take advantage of electric circuits to build calculators. This happened during World War II when a group of mathematicians led by J. Presper Eckert and John Mauchly at the University of Pennsylvania’s Moore School of Electrical Engineering began developing an electronic computational machine, the Electronic Numerical Integrator And Calculator, called ENIAC. Its purpose was to speed up the calculations for artillery‑aiming tables. ENIAC was more like an electronic calculator than a computer, but instead of representing a binary number with on and off settings on wheels the way a mechanical calculator did, it used vacuum tube “switches.”

Soldiers assigned by the army to the huge machine wheeled around squeaking grocery carts filled with vacuum tubes. When one burned out, ENIAC shut down and the race began to locate and replace the burned‑out tube. One explanation, perhaps somewhat apocryphal, for why the tubes had to be replaced so often was that their heat and light attracted moths, which would fly into the huge machine and cause short circuits. If this is true, it gives new meaning to the term “bugs” for the little glitches that can plague computer hardware or software.

When all the tubes were working, a staff of engineers could set up ENIAC to solve a problem by laboriously plugging in 6,000 cables by hand. To make it perform another function, the staff had to reconfigure the cabling–every time. John von Neumann, a brilliant Hungarian‑born American, who is known for many things, including the development of game theory and his contributions to nuclear weaponry, is credited with the leading role in figuring out a way around this problem. He created the paradigm that all digital computers still follow. The “von Neumann architecture,” as it is known today, is based on principles he articulated in 1945–including the principle that a computer could avoid cabling changes by storing instructions in its memory. As soon as this idea was put into practice, the modern computer was born.

Today the brains of most computers are descendants of the microprocessor Paul Allen and I were so knocked out by in the seventies, and personal computers often are rated according to how many bits of information (one switch in the lighting example) their microprocessor can process at a time, or how many bytes (a cluster of eight bits) of memory or disk‑based storage they have. ENIAC weighed 30 tons and filled a large room. Inside, the computational pulses raced among 1,500 electro‑mechanical relays and flowed through 17,000 vacuum tubes. Switching it on consumed 150,000 watts of energy. But ENIAC stored only the equivalent of about 80 characters of information.

By the early 1960s, transistors had supplanted vacuum tubes in consumer electronics. This was more than a decade after the discovery at Bell Labs that a tiny sliver of silicon could do the same job as a vacuum tube. Like vacuum tubes, transistors act as electrical switches, but they require significantly less power to operate and as a result generate much less heat and require less space. Multiple transistor circuits could be combined onto a single chip, creating an integrated circuit. The computer chips we use today are integrated circuits containing the equivalent of millions of transistors packed onto less than a square inch of silicon.

In a 1977 Scientific American article, Bob Noyce, one of the founders of Intel, compared the $300 microprocessor to ENIAC, the moth‑infested mastodon from the dawn of the computer age. The wee microprocessor was not only more powerful, but as Noyce noted, “It is twenty times faster, has a larger memory, is thousands of times more reliable, consumes the power of a lightbulb rather than that of a locomotive, occupies 1/30,000 the volume and costs 1/10,000 as much. It is available by mail order or at your local hobby shop.”
