In a recently published video, John Christy explains clearly the limits of scientists’ understanding of earth’s climate system. It is well worth anyone’s time to view.
Dr. Christy makes the important point that all science is based upon objective measurements of the world. Feelings, intuitions, anecdotes and shared opinions do not provide proof for a scientific understanding of something. Science requires data, numerical records of observed measurements.
This post is about how much we owe to ancestors who invented standardized units of weights and measures without which we would have no science at all.
It happened last week that my home north of Montreal was without electrical power for 3 nights and 2 days. The whole experience drove home how much our lives depend on reliable, affordable electricity. Yes, our home heating system is electrical.
My e-readers’ batteries ran out, leaving me to read real paper books by the light of our hurricane lamp. Thus, I revisited a book from many years ago that provides much interesting information on this subject: Charles Panati’s Browser’s Book of Beginnings: Origins of Everything Under, and Including, the Sun.
CHARLES PANATI, a former physicist and for six years a science editor for Newsweek, is the author of many non-fiction and fiction books, including six works on “origins.” The text below comes from Panati, the images from various internet sources.
To measure lengths, the Egyptians turned to parts of the human body. We know many of these measurements by terms later derived from Latin. A cubit, the oldest enduring standard measure, devised about 3000 B.C., was the length of a grown man’s arm from the elbow to the tip of the outstretched middle finger–about 20.5 inches in modern units. The cubit’s basic sub-unit was a digit, which was the breadth, not the length, of a finger. Twenty-eight digits equaled 1 cubit.
The palm, not surprisingly, was another unit. One palm equaled 4 digits. (Measure it yourself, by holding the four fingers of one hand against the other hand’s palm.) A palm plus a digit totaled 5 digits, or a hand. Palms were combined to make several larger units, and a digit was elaborately subdivided, resulting in a complex, but amazingly accurate, system of measurement.
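The relationships among these units can be summarized in a short sketch. This is only an illustration of the conversions described above, using the approximate 20.5-inch cubit given in the text; the dictionary and function names are my own.

```python
# Egyptian length units expressed in digits, the base sub-unit.
DIGITS_PER_UNIT = {
    "digit": 1,
    "palm": 4,    # 4 digits
    "hand": 5,    # a palm plus a digit
    "cubit": 28,  # 28 digits
}

INCHES_PER_CUBIT = 20.5  # approximate modern equivalent from the text

def to_inches(value, unit):
    """Convert a length in an ancient Egyptian unit to modern inches."""
    digits = value * DIGITS_PER_UNIT[unit]
    return digits * INCHES_PER_CUBIT / DIGITS_PER_UNIT["cubit"]

print(to_inches(7, "palm"))  # 7 palms = 28 digits = 1 cubit = 20.5 inches
```

Note how the system hangs together: seven palms make exactly one cubit, which is why the sub-units could be combined so flexibly.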
The Great Pyramid of Giza, built by thousands of workers with minimal architectural knowledge, boasts sides that vary no more than 0.05 percent from the mean length–that is, a deviation of only 4.5 inches over a span of 755 feet.
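The pyramid figure quoted above is easy to verify with a line or two of arithmetic: a 0.05 percent deviation over a 755-foot side does indeed come to about 4.5 inches.

```python
# Checking the Great Pyramid figure: 0.05% deviation over a 755-foot side.
side_inches = 755 * 12            # 9060 inches per side
deviation = side_inches * 0.0005  # 0.05 percent
print(round(deviation, 1))        # 4.5 inches, matching the text
```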
The ancient Greeks borrowed from Egyptian and Babylonian systems and made their own refinements; they also preferred terms related to the human body. Sixteen fingers combined to make 1 foot, and 24 fingers made an “Olympic cubit.” The Romans copied from the Greeks, but subdivided the foot into 12 inches. They also used the mile, the yard and, for weight, the pound.
A system of standard weights based on the human body was unfeasible, since there were too many natural variations to rely on an average man. Instead, the Babylonians devised a system based on metal objects, or trinkets, of various sizes and shapes.
The earliest unit of weight was the mina. Minas often took the shape of a duck, and each of several unearthed at an archaeological dig weighs roughly 640 grams. Also discovered was a swan weighing 30 minas. The Babylonians also used standard-size “coins” from which the Hebrews adopted their unit of weight, the “shekel”, about half an ounce, and also a silver coin weighing that amount, frequently mentioned in the Bible.
The Metric Revolution
Almost all of the ancient and medieval weights and measures fell into disuse, to be replaced by the metric system. The French Revolution was not only political, but overturned many previously sacrosanct institutions. With the fall of the Bastille on July 14, 1789, King Louis XVI had to give way to a constituent National Assembly, which proceeded to make many changes. Prominent among them was the adoption in June 1799 of the metric system.
Members of the French Academy of Sciences had taken on the task of devising a metric system. They decided that the length of the meridian passing through Paris from the North Pole to the Equator should serve as a fixed distance, and that one ten-millionth of that distance should be called a meter. The unit of weight, the gram, was to be the weight of a cubic centimeter of water. Sub-units such as the centimeter and millimeter were also proposed, as well as super-units such as the kilometer.
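The meridian definition explains a familiar round number. If the pole-to-equator distance is ten million meters by definition, the Earth’s full circumference works out to 40,000 kilometers, which is why that figure is still quoted today. A quick check:

```python
# The meter's original definition: one ten-millionth of the meridian
# distance from the North Pole to the Equator (through Paris).
quarter_meridian_m = 10_000_000  # meters, by definition

# Four quarter-meridians make a full polar circumference.
earth_circumference_km = 4 * quarter_meridian_m / 1000
print(earth_circumference_km)  # 40000.0 km, the familiar round figure
```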
The metric system was adopted under the motto “For all time, for all people”, a sentiment in accord with the revolutionary tenor of the time.
Many are aware that the earliest reckoning of time referred to moons (or months), but as civilizations became more complex, shorter periods proved more convenient as measurements of time. For a long while, the idea of a week differed from place to place: West Africans had a four-day week, central Asians opted for five days, and the Assyrians adopted a six-day week, the period between market days.
It was the Babylonians who preferred to measure a month by the moon’s natural cycle, reckoned as 28 days (more accurately, the moon’s waxing and waning takes approximately 29.5 days). For convenience in business transactions–and also because of their belief in the sacredness of the number seven–they grouped the days into four seven-day weeks, the origin of our present system.
The ancient Greeks could have invented the thermometer, since they were well acquainted with the behavior of certain liquids and gases under conditions of changing temperature. Several scientists attempted to measure quantitative differences between hot and cold, but success came only late in the 16th century, to the Italian astronomer Galileo.
Galileo’s device was actually a thermoscope, which had no degree scale, and measured only gross changes in temperature. A large glass bulb with a long, narrow, open-mouthed neck rested inverted over a vessel of colored water or alcohol. When air was forced from the bulb, the liquid rose up a short distance into the neck. When the bulb’s temperature changed, the air in it either expanded or contracted, and the level of liquid in the tube changed accordingly.
In 1611, the first scale was introduced by Sanctorius, a contemporary of Galileo. He gauged the low point by noting the level of the liquid when the thermoscope was surrounded by melting snow. Then he held a candle beneath it to mark the high point. From his observations, he arrived at a scale of 110 equal parts, or degrees. Thus, the thermo-scope, for “seeing” temperature changes, became a thermo-meter, for measuring those changes.
Early thermometers were inaccurate because changes in barometric pressure caused liquid levels to shift even when the temperature did not. This problem was solved in 1644 when Grand Duke Ferdinand II of Tuscany introduced the hermetically sealed thermometer. He also founded, in 1657, an academy for experimentation to improve temperature devices. They did not use mercury as modern models do (though academy members experimented with that liquid metal), but red wine instead, since it expanded faster when heated.
These are but a few, mostly ancient, examples of human inventions contributing to the rich scientific framework we have inherited. Many more have been added in modern times, and who knows what the future will bring. Below is a whimsical look at some possibilities.
Since science depends on measuring things, you need to know the correct units for what you are studying. Below are some obscure measures for special situations.
Footnote on the Importance of Measurements