Gary S. Coyne - A Practical Guide to Materials, Equipment, and Technique.pdf

Measurement: The Basics 2.1


were acceptable as fundamental standards on which to base our English units. Thus, the metric system was not implemented by the government and was not adopted by the nation at large.

By 1968 the United States was the only major country that had not adopted the SI system. Congress ordered an investigation to determine whether the United States should follow the International Standard. In its 1971 report, Congress recommended that we have a coordinated program of metric conversion such that within 10 years we would be a "metric country." With this hope in mind, President Ford signed the Metric Conversion Act of 1975. Unfortunately, as of the writing of this book, in 1997, we see little evidence of "metrification." In the near future there may be difficulties for U.S. manufacturers because the European Common Market has decided that by 1992, it will no longer accept items not made to metric standards. Added to this is ISO 9000, a program to formalize procedures and methodology. Although not specifically required, it indirectly provides an advantage to those using metric systems.

The history of measurement shows constant attempts to better define and refine our measurement units. Early measurement standards were arbitrary, such as the Egyptian cubit (the distance from the tip of the finger to the elbow). Since then we have tried to base measurement standards on non-changing, consistent, and repeatable references, some of which turned out to be inconsistent and/or impractical. The work of the metrologist will never be complete: the more accurately we can define our measurement standards, the better we can measure the properties of our universe.

2.1.3 The Base Units

This book focuses on the original base units (length, mass, and temperature), plus one of the derived units (volume), because these are the most commonly used measurements in the lab. Time is included in this section only because it provides an interesting commentary on the metrologists' desire to split hairs in their endeavor to achieve accuracy.

Length. The original metric standard for the length of a meter was "one ten-millionth part of a quadrant of the earth's meridian." Astronomical measurements (at that time) indicated that one-tenth of a quadrant of the earth's meridian lay between Dunkirk, in France, and a point near Barcelona, Spain. This distance was dutifully measured and divided by one million to obtain the meter. It is fortunate that the meter was later redefined, because if you take what we now call a meter and measure the distance between the Dunkirk and Barcelona points used, you get a length of 1,075,039 meters, an accuracy of only 7.5 parts in 100.
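The size of that error can be checked with a couple of lines of arithmetic. This is only an illustrative sketch, using the figures quoted above:

```python
# Distance between the Dunkirk and Barcelona survey points,
# remeasured in modern meters (figure quoted in the text).
measured_m = 1_075_039
# The original survey treated this arc as exactly one million meters.
assumed_m = 1_000_000

# Fractional error of the original meter definition.
error = (measured_m - assumed_m) / assumed_m
print(f"{error * 100:.1f} parts in 100")  # 7.5 parts in 100
```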

Reproducible accuracy was increased substantially by the development and use of the International Prototype Metre, the platinum-iridium bar. By physically comparing the lines on this bar with a secondary prototype, an accuracy was achieved to within two parts in 10 million.

In 1960 (at the same meeting during which the International System of Units was accepted) the meter was redefined as 1,650,763.73 wavelengths (in vacuum) of the orange-red spectral line of krypton-86. This system had several problems, chief of which was that this figure could be reached only by extrapolation, because it was not possible to measure an accurate quantity of waves beyond some 20 centimeters. To obtain the required number of wavelengths for a meter, several individual measurements were taken in succession and added together. This procedure, by its nature, increased the likelihood of error. Despite these limitations, the krypton-86-based meter brought the accuracy to within two parts in 100 million.

In 1983 yet another definition of the meter was adopted: the distance that light travels (in a vacuum) during 1/299,792,458 of a second. This definition provided a measurement ten times more accurate than that obtained through krypton-86 techniques. This meter, accurate to within four parts in one billion, is still in use today.
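Because the 1983 definition makes the speed of light exact, the meter falls out of the definition by simple multiplication. A quick sketch (not from the text), using exact rational arithmetic to avoid rounding:

```python
from fractions import Fraction

# Speed of light in a vacuum, exact by definition since 1983 (m/s).
c = 299_792_458
# The meter is the distance light travels in 1/299,792,458 of a second.
t = Fraction(1, 299_792_458)

meter = c * t  # distance = speed x time
print(meter)   # 1
```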

Mass. The kilogram is unique in several respects within the metric system. For one, it is the only base unit whose name includes a prefix (the gram is not a base unit of the International System of Units). Also, it is the only base unit defined by a physical object as opposed to a reproducible physical phenomenon. The kilogram was based on the weight of one cubic decimeter of water at its most dense state (4°C). The original "standard mass kilogram" was made by constructing a brass weight, making careful weighings (using Archimedes' Principle*), and then making an equivalent platinum weight. The platinum weight is called the Kilogramme des Archives and is kept at the International Bureau of Weights and Measures near Paris, France.†

About a hundred years later, during studies verifying previous data, it was discovered that a small error had been made in the determination of the kilogram. The problem was that very small variations in temperature cause very small variations in density, which in turn caused inaccurate weighings. The CGPM decided against changing the mass of the kilogram to a corrected amount; rather, it accepted the mass of the kilogram as that of the original Kilogramme des Archives.

It has been hoped that some naturally occurring phenomenon will be found to which we can ascribe the value of the kilogram, thus allowing the kilogram to be based on a reproducible phenomenon rather than relying on a physical artifact. Although there have been several efforts toward this goal, such as counting molecules, our current technological level cannot achieve any greater accuracy than that obtained by simply comparing weights of unknowns against our current prototypes.

*Simply stated, a submerged body is buoyed up by a force equal to the weight of the water that the submerged body displaces. In other words, if an object weighs 1.5 kilograms in air, and the weight of the water it displaces is 1.0 kilogram, the object will weigh 0.5 kilograms in water.

†A copy of the prototype is held by the National Institute of Standards and Technology (formerly the National Bureau of Standards) and by all other countries that have signed agreements following the International System of Units.
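The buoyancy rule in the footnote on Archimedes' Principle reduces to a subtraction; a minimal sketch using the footnote's numbers:

```python
def apparent_weight(weight_in_air_kg, displaced_water_kg):
    """Weight of a submerged object: the buoyant force equals the
    weight of the water the object displaces (Archimedes' Principle)."""
    return weight_in_air_kg - displaced_water_kg

# The footnote's example: 1.5 kg in air, displacing 1.0 kg of water.
print(apparent_weight(1.5, 1.0))  # 0.5
```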

Volume. As previously mentioned, the unit for volume (the liter) was to be one cubic decimeter of pure water at its most dense state (4°C). Later analysis determined that errors had been made in the determination of the kilogram and that the mass of one cubic decimeter of water was slightly less than the prototype kilogram. However, the use of the kilogram as it had already been defined was so well established that change became impossible. Thus, in 1872, the "Commission Internationale du Metre" met to redefine the kilogram as the mass of a particular standard weight (the Prototype Kilogram) instead of the weight of a liter of water.

The concept of the liter was thus cast into doubt. Was it to be based on the weight of a standard volume of dense water as before, or was it to be an alternative name for the cubic decimeter?

The first attempt to resolve the conflict came in 1901, when the "Comite International des Poids et Mesures" resolved that "The unit of volume for determinations of high precision is the volume occupied by a mass of 1 kilogram of pure water at its maximum density and under normal atmospheric pressure." This volume, so defined, was identified as a "liter." However, this definition actually made the liter equivalent to 1.000027 dm3, so there was once again a discrepancy as to what a liter was. So, at the 12th conference of the "Comite International des Poids et Mesures," the unit of volume was redefined for the last time as the cubic decimeter. The liter, no longer the official unit of volume, was nevertheless so close to the cubic decimeter in size that unofficially it was deemed acceptable to continue its use; however, for situations requiring extremely high-quality measurements, only the cubic decimeter was acceptable. The cubic decimeter remains the standard of volume today.
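The 1901 discrepancy is easy to quantify. A short sketch, assuming the 1.000027 dm3 figure quoted above:

```python
# The 1901 "liter" (mass-of-water definition), in cubic decimeters.
liter_1901_dm3 = 1.000027
# The modern liter is exactly one cubic decimeter.
liter_dm3 = 1.0

# Discrepancy per liter, expressed in milliliters (1 dm3 = 1000 mL).
discrepancy_mL = (liter_1901_dm3 - liter_dm3) * 1000
print(f"{discrepancy_mL:.3f} mL")  # 0.027 mL
```

So the two definitions differed by only 27 microliters per liter, which explains why the liter remained acceptable for everyday work but not for high-precision measurement.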

Temperature. Temperature was not one of the original properties that the French academy deemed necessary to include in the metric system. In fact, as late as 1921, members of the 6th General Conference on Weights and Measures were still objecting to the inclusion of measurements other than length and mass, seemingly for no reason other than to keep the base units "pure."

The measurement of temperature is a measurement of energy and therefore has different characteristics than other measured properties. The primary difference is that temperature measurement is not cumulative. You can take two different meter sticks and place them end to end to measure something longer than one meter, and the comparable action can be taken for the measurement of mass. However, you cannot take two thermometers and add their readings to measure a material whose temperature is greater than the scale of either thermometer.

The first thermometer is believed to have been created by Galileo sometime between 1592 and 1598. It was a glass sphere attached to one end of a long, thin glass tube; the other end of the tube was lowered vertically into a bowl of colored water (or perhaps wine). Using the warmth of his hands, Galileo heated the glass sphere, causing bubbles to come out of the glass tube. Then, by letting the sphere cool to room temperature, he caused the liquid to be drawn up into the tube. As the glass sphere was subsequently heated or cooled, the liquid would ride up and down the tube.

Although there is no record that Galileo ever calibrated the tube, he used it in temperature studies. Even had he decided on fixed points with which to establish specific temperatures, his thermometer would have been impossible to calibrate because it was exposed to the atmosphere and therefore subject to variations in atmospheric pressure. By 1640 it was realized that the "air thermometer" was subject to variations in barometric pressure, and the sealed thermometer was created. However, the need to establish fixed points of reference had still not been addressed.

The need to establish fixed points to provide uniformity between thermometers is no different from the need for uniformity in any measurement system. Many thermometers were built in the following years, but either there was no calibration, or the calibration was not based on any repeatable fixed point. One of the first attempts at establishing calibration points was made in 1693 by Carlo Renaldini of Padua, who set the low temperature point with ice. For the next point, he took 11 ounces of boiling water and mixed it with one ounce of cold water. Next he mixed 10 ounces of boiling water with two ounces of ice water. The process continued until 12 divisions were established on the thermometer. Although Renaldini had an interesting approach to establishing fixed points, it was neither practical nor accurate.

The first commonly used temperature measurement scale was devised in 1724 by the Dutch scientist D. Gabriel Fahrenheit. It took 16 years for Fahrenheit to devise a process for calibrating his scales. Fahrenheit used three points for calibration. The lowest, 0°, was the lowest temperature he could create by mixing ice, water, and ammonium chloride (a slush bath of sorts). The second point, 32°, was the temperature of ice, and the high point was human body temperature, 96°. The choice of the low temperature point was logical, but the choice of the high point seems illogical. Lindsay2 assumed that 96 was chosen because it was an even multiple of 12. However, I haven't seen any evidence that the Fahrenheit scale was ever divided into multiples of 12. Also, human body temperature is closer to 98°. If the scale were originally based on a duodecimal system and changed to a decimal system, the human body would be 100° Fahrenheit. In any case, if the human body were fixed at 96°, that fixed point would not change as thermometers were made more accurate. Regardless of the logic Fahrenheit used for his scale, his quality was excellent and his thermometers became very popular.

In 1742 Anders Celsius, a Swedish astronomer, developed the mercury centigrade thermometer. He chose the boiling and freezing points of water as calibration points. Curiously, he chose 0° for the high temperature and 100° for the low temperature.* His choices were reversed in 1750 by Marten Stromer, also a Swedish astronomer. In 1948 the centigrade scale was officially renamed the Celsius scale.

During the mid-19th century, Lord Kelvin theorized that as temperature drops, so does thermal motion; thus, 0° should be the point at which there is zero motion. This new 0° would be equal to -273.15 degrees Celsius. Fortunately, Kelvin had the foresight to keep things simple and made a one-kelvin increment exactly equal to a one-degree Celsius increment. Originally, the unit kelvin was capitalized; now it is not, and should be written as kelvin. The abbreviation of kelvin is capitalized and written as K.†

There are two other temperature scales that may still be seen in old texts or journals but are not acceptable for any current scientific work. Perhaps the rarer is the Reaumur scale (°Re), which divided the range between the freezing and boiling points of water into 80 units and was used in parts of Europe. The other, the Rankine scale, may be referred to in old books on thermodynamics; it was named after W. J. M. Rankine, who did early research in that field. Rankine is to Fahrenheit what kelvin is to Celsius: just as an increment of one kelvin equals one degree Celsius, an increment of one degree Rankine equals one degree Fahrenheit. Thus, 0 K = 0°R = -273.15°C = -459.67°F.
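Because all four scales differ only in zero point (and, between the kelvin/Celsius and Rankine/Fahrenheit pairs, in increment size), conversions are simple linear shifts. A sketch verifying the equalities above:

```python
def celsius_to_kelvin(c):
    """Same increment size; only the zero point shifts."""
    return c + 273.15

def celsius_to_fahrenheit(c):
    """Fahrenheit degrees are 5/9 the size of Celsius degrees."""
    return c * 9 / 5 + 32

def fahrenheit_to_rankine(f):
    """Rankine is to Fahrenheit what kelvin is to Celsius."""
    return f + 459.67

# Absolute zero on all four scales: 0 K = 0°R = -273.15°C = -459.67°F
print(celsius_to_kelvin(-273.15))                # 0.0
print(round(celsius_to_fahrenheit(-273.15), 2))  # -459.67
print(fahrenheit_to_rankine(-459.67))            # 0.0
```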

In 1954 the General Conference wanted to redefine the temperature scale using various primary points in addition to the two points of freezing and boiling water. The triple point of water (at 273.16 K) proved easy to obtain and very accurate (one part in a million). In 1960 the triple point of water and five other fixed points were accepted for an International Practical Temperature Scale.* This scale was superseded in 1968 by the International Practical Temperature Scale (IPTS 1968), which added eight more fixed points. The current scale is shown in Table 2.29.

Time. How long is a second, and how do you store that measurement? Various attempts at using the swing of a pendulum proved inadequate because the endpoints of a given swing could not be measured accurately. The people who devised very complex and systematic methods for defining the metric system did nothing for time. During that period, 1/86,400 of a mean solar day was considered an adequate definition of one second. However, by the mid-1900s the mean solar day was found to vary by as much as three seconds per year. In 1956 the

*The Royal Society of London in the early 1700s used a reversed scale that described 0° as "extreme hot" and 90° as "extreme cold."

†Measurements in kelvin are not preceded by the "°" symbol.

*Six different points were adopted because no one thermometer can read a full range of temperatures, and no one thermometer can read a wide range of temperatures accurately. Thus, different fixed points allowed for thermometers measuring different ranges to be accurately calibrated.
