Lectures_SSD2_Yermakova / Lectures_SSD2 Yermakova.doc

2.4.2 Lab: Researching a Computer System

You can research a computer system using the Web by retrieving product reviews and price comparisons. Suppose you are interested in a particular line of notebook computers, such as the Dell Inspiron or the Sony VAIO. The following activity will lead you through a sample comparison.

Learning Exercise:

  • Go to the Reviews section of Ziff Davis Web site (www.zdnet.com), and select a notebook machine that looks interesting.

  • Read the detailed review of the product and check the latest price information.

  • Ziff-Davis also publishes the magazine Computer Shopper and its companion Web site www.zdnet.com/computershopper.

  • You can also find product reviews and pricing info at the CNET Web site www.cnet.com.

-

2.4.3 Lab: Online Configuration

Some computer-vendor Web sites allow you to specify a system configuration by selecting from various menus listing available options. Then, when you click the "update price" button, you can see the exact price for the system you selected. Two such Web sites are Dell (www.dell.com) and Gateway, Inc. (www.gateway.com).

Learning Exercise:

  • Visit the site of a computer vendor. Assume that you have a budget of $1,200 and put together the specification for a computer that is appropriate for a college student studying Computer Science.

  • Now assume you're buying a notebook computer for a businessperson who is a frequent airline traveler and is concerned about weight and battery life. What can you get for $2,500?

-

2.5 Improving Computer Performance

How do you measure computer performance? And, how is computer performance being improved? These are some of the topics covered in this section.

Reading Sequence:

  • 2.5.1 Moore's Law. Learning Goal: Knowledge of the basis for the exponential growth in the computer's memory storage and computational abilities.

  • 2.5.2 Bottlenecks. Learning Goal: An understanding of performance bottlenecks and how to correct them.

  • 2.5.3 Throughput and Latency. Learning Goal: Definition of throughput and latency with respect to computer performance.

  • Parsons/Oja, Chapter 8-Section B: "Image Compression". Learning Goal: An understanding of how data compression can be used 1) to reduce the amount of space required to store files and 2) to improve throughput by reducing the number of bytes that must be transmitted.
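The compression idea in the last reading can be demonstrated in a few lines of Python. This is a minimal sketch using `zlib`, one standard lossless codec from the Python standard library, not the image-specific methods the textbook section covers; the sample data is made deliberately repetitive so that the effect is easy to see.

```python
# Minimal sketch: lossless compression reduces both storage space and
# the number of bytes that must be transmitted. Uses Python's zlib.
import zlib

text = b"AAAA BBBB AAAA BBBB " * 500   # highly repetitive sample data
compressed = zlib.compress(text)

# Lossless: the original data is recovered exactly.
assert zlib.decompress(compressed) == text

print(len(text), len(compressed))      # compressed form is far smaller
```

Real files compress less dramatically than this artificial example, since the achievable ratio depends on how much redundancy the data contains.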

                  

Assessments:

  • Multiple-Choice Quiz 7

-

2.5.1 Moore's Law

A transistor is an electronic switch that can alternate between two states, "on" and "off," representing one bit of information. Modern microchips contain millions of transistors, each so small that it cannot be seen with the naked eye. In 1965, Gordon Moore, one of the founders of Intel, observed that microchip capacity (the number of transistors that fit on a single silicon chip) had been doubling every year. This trend, which has become known as Moore's Law, continues into the present, although the rate of change has slowed recently so that chip capacity now doubles every 12-18 months rather than every year. Moore's Law, an example of exponential growth, refers specifically to the capacity of microchips, and it might be stated this way: the number of transistors that can be put on a microchip will double every 12-18 months, until physical limitations are reached.

To illustrate the power of exponential growth, consider the parable of the inventor of chess and his emperor. The emperor offered the inventor any reward he wanted for creating the game of chess. The inventor asked for one grain of rice for the first square of the chessboard, with each additional square receiving double the previous square's amount of rice. The emperor immediately granted his wish. There are 64 squares on a chessboard. Through the 32nd square, a little over 4 billion grains of rice would have been given, about one large field's worth. The next square alone would require about 4 billion more grains, the one after that about 8 billion, and so on. The 64th square by itself would require 2^63, or about 9 x 10^18, grains of rice, more than could be produced even if the entire surface of the earth were used to grow rice.
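The numbers in the parable are easy to verify directly; a short Python sketch:

```python
# Grains of rice on a chessboard: square k holds 2**(k-1) grains,
# and the running total through square k is 2**k - 1.
def grains_on(square):
    return 2 ** (square - 1)

def total_through(square):
    # 1 + 2 + 4 + ... + 2**(square-1) is a geometric series summing to 2**square - 1
    return 2 ** square - 1

print(total_through(32))   # a little over 4 billion grains through square 32
print(grains_on(64))       # about 9.2 x 10**18 grains on the last square alone
```

Note that the total over all 64 squares (2^64 - 1) is nearly double the amount on the last square alone, another characteristic of exponential growth: each term is larger than the sum of all the terms before it.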

The number of transistors on a single chip has increased at this exponential rate, doubling every 12-18 months. Below is a graph illustrating the exponential increase in the number of transistors on processors introduced over the years.

Figure 1 Illustration of Moore's Law applied to Intel Processors

Below is the same data plotted on a logarithmic scale, to give a different perspective on the exponential growth of transistors on a microchip.

Figure 2 Illustration of Moore's Law applied to Intel Processors in log scale

For more recent data, see Intel's press kit.

With the exponential growth of transistor density on microchips, many inferences can be made that allow analysts to predict other developments in the computer industry. Extending the scope of Moore's Law, the following predictions can be made:

  1. Processing power (speed) doubles every 12-18 months.

  2. Storage capacity of RAM doubles every 12-18 months.

Other observations are that storage capacity of hard disk drives is also increasing exponentially, and the cost for consumers to purchase computer parts is decreasing over time.
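The doubling predictions above can be turned into a rough projection formula. This sketch is illustrative only: the starting count and the 18-month doubling period are assumptions, not measured Intel data.

```python
# Rough sketch of Moore's Law as a projection: capacity doubles once
# every `doubling_months` months, so after m months there have been
# m / doubling_months doublings.
def projected_transistors(start_count, months, doubling_months=18):
    return start_count * 2 ** (months / doubling_months)

# Example: a chip with 1 million transistors, projected 6 years (72 months)
# ahead at an 18-month doubling period, gives 4 doublings.
print(projected_transistors(1_000_000, 72))
```

The same formula applies to any quantity growing under a fixed doubling period, such as RAM capacity or disk density, as the surrounding text observes.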

The reason Moore's Law continues to hold true is that circuitry is becoming ever smaller. Circuits that used to require hundreds of square microns of silicon (a micron is a millionth of a meter) now fit into just a few square microns. This trend has enabled more and more circuits to be packed into the same area. Processors, memory chips, and special-purpose chips for controlling peripheral devices are all becoming denser. Although Moore's Law only predicts the increase in circuit density, this increase in density reduces the time required for inter-component communications, which also means that chips can process data faster.

Improvements in microchip technology are being matched by improvements in several other technologies found in computer systems. Disk capacity is increasing for a variety of reasons. Improvements in magnetic media (the iron oxide coating on the surface of a disk, flatter platters, etc.) and read/write electronics are increasing the capacity of hard disk drives. Introduction of new optical disk technologies is another source of increased storage capacity for personal computers. Corresponding increases in processor speed and bus bandwidth enable computers to take full advantage of the growth in storage capabilities.

Alongside the growth in processing speed and storage capacity, the cost per byte of data processed or stored keeps falling as lower-capacity memory chips become outdated. For instance, what a 64 MB RAM module cost a couple of years ago is now roughly what a 128 MB module costs.

An interesting counter to improvements in capacity and throughput is known as Parkinson's Law of Data, which says that data expands to fill the space available. In other words, as more memory or disk space becomes available, the demand for more memory or disk space increases accordingly. For example, when computers had only a few kilobytes (KB) of memory, their simple operating systems fit in as little as 4 KB. Today's microcomputers typically have 128 MB or more of memory and, as Parkinson's Law would predict, today's operating systems are much more elaborate and require tens of megabytes of memory for their own use. Similarly, as disk drive capacity increases, people begin using them in new ways. Early computers with 360 KB floppy disks mainly stored small text files. Today, when computers routinely come with multi-gigabyte hard drives, people store musical recordings, short video clips (each file several megabytes in length), and even collections of feature-length films on DVD (typically about 5 gigabytes).

Parkinson's Law drives the entire computing industry: vendors know that applications will always expand to keep pace with Moore's Law. As capacity increases, users ask for even more performance in order to accomplish more ambitious tasks. Thanks to Moore's Law, we can expect to see continued technological improvements to meet consumer demand for greater performance at affordable prices. (But note that Moore's Law doesn't cover all aspects of computer technology. It says nothing about increases in system reliability, or about the quality of the software programs used in computer systems.)

Without fundamental changes in chip technology, the laws of physics suggest that there are limits to how far we will be able to improve computing performance. For example, the circuit pathways have to be wide enough for electrons to pass through. Another limitation is the wavelength of light. Light is used to etch circuits into silicon, and the width of the pathways etched is related directly to the wavelength of the light used to do the etching—the shorter the wavelength, the narrower the pathway. Ultraviolet light has a shorter wavelength than visible light, and X-rays are shorter still. But, there are technical problems with using wavelengths that short. What happens when the limit is reached? We don't know, but experience suggests that progress will continue, possibly in unanticipated directions. At some point, the cost of producing ultra-dense chips may restrict their use to the most expensive supercomputers.

-