Natural History Magazine
Published under the title “Naming Rights.”
How to stake a claim in the dictionary of science.
If you visit the gift shop at the Hayden Planetarium in New York City, you’ll find all manner of space-related paraphernalia for sale. Familiar things are there—plastic models of the Space Shuttle and the International Space Station, cosmic refrigerator magnets, Fisher space pens. But unusual things are there too—dehydrated astronaut ice cream, astronomy Monopoly, Saturn-shaped salt-and-pepper shakers. And that’s not to mention the weird things such as Hubble Telescope pencil erasers, Mars rock super-balls, and edible space worms. With hindsight, you’d expect a place like the planetarium to stock such stuff. But something much deeper is going on. The gift shop bears silent witness to the iconography of a half-century of American scientific discovery.
In the twentieth century, astronomers in the United States discovered galaxies, the expanding universe, the nature of supernovas, quasars, black holes, gamma ray bursts, the origin of the elements, the cosmic microwave background, and most of the known planets in orbit around stars other than the Sun. Although the Russians reached one or two places before us, we sent space probes to Mercury, Venus, Jupiter, Saturn, Uranus, and Neptune. American probes have also landed on Mars and on the asteroid Eros. American astronauts have walked on the Moon. And nowadays most Americans take all this for granted, which is practically a working definition of culture: something everyone does or knows about, but no longer actively notices.
While shopping at the supermarket, most Americans aren’t surprised to find an entire aisle filled with sugar-loaded, ready-to-eat breakfast cereals. But foreigners notice this kind of thing immediately, just as traveling Americans immediately notice that supermarkets in Italy have vast selections of pasta, and that markets in China and Japan offer an astonishing variety of rice. The flip side of not noticing your own culture is one of the great pleasures of foreign travel: realizing what you hadn’t noticed about your own country, and noticing what the people of other countries no longer realize about themselves.
Snobby people from other countries like to make fun of the U.S. for its abbreviated history and its uncouth culture, particularly compared with the millennial legacies of Europe, Africa, and Asia. But five hundred years from now historians will surely see the twentieth century as the American century—the one in which American discoveries in science and technology rank high on the world’s list of treasured achievements.
Obviously the U.S. has not always sat atop the ladder of science. And there’s no guarantee or even likelihood that American preeminence will continue. As the capitals of science and technology move from one nation to another, rising in one era and falling in the next, each culture leaves its mark on the continual attempt of our species to understand the universe and our place in it. When historians write their accounts of such world events, the traces of a nation’s presence on center stage sit prominently in the timeline of civilization.
Many factors influence how and why a nation will make its mark at a particular time in history. Strong leadership matters. So does access to resources. But something else must be present—something less tangible, but with the power to drive an entire nation to focus its emotional, cultural, and intellectual capital on creating islands of excellence in the world. Those who live in such times often take for granted what they have created, on the blind assumption that things will continue forever as they are, leaving their achievements susceptible to abandonment by the very culture that created them.
Beginning in the 700s and continuing for nearly 400 years—while Europe’s Christian zealots were disemboweling heretics—the Abbasid caliphs created a thriving intellectual center of arts, sciences, and medicine for the Islamic world in the city of Baghdad. Muslim astronomers and mathematicians built observatories, designed advanced timekeeping tools, and developed new methods of mathematical analysis and computation. They preserved the extant works of science from ancient Greece and elsewhere and translated them into Arabic. They collaborated with Christian and Jewish scholars. And Baghdad became a center of enlightenment. Arabic was, for a time, the lingua franca of science.
The influence of these early Islamic contributions to science remains to this day. For example, so widely distributed was the Arabic translation of Ptolemy’s magnum opus on the geocentric universe (originally written in Greek in AD 150) that even today, in all translations, the work is known by its Arabic title Almagest, or “The Greatest.”
The Iraqi mathematician and astronomer Muhammad ibn Musa al-Khwarizmi gave us the words algorithm (from his name, al-Khwarizmi) and algebra (from the word al-jabr in the title of his book on algebraic calculation). And the world’s shared numerals—0, 1, 2, 3, 4, 5, 6, 7, 8, 9—though Indian in origin, were neither common nor widespread until Muslim mathematicians exploited them. The Muslims, furthermore, made full and innovative use of the zero, which did not exist among Roman numerals or in any established numeric system of Europe. Today, with legitimate reason, the ten symbols are internationally referred to as Arabic numerals.
Portable, ornately etched, brass astrolabes were also developed by Muslims, from ancient prototypes, and became as much works of art as tools of astronomy. An astrolabe projects the domed heavens onto a flat surface and, with layers of rotating and non-rotating dials, resembles the busy, ornate face of a grandfather clock. It enabled astronomers, as well as others, to measure the positions of the Moon and the stars on the sky, from which they could deduce the time—a generally useful thing to do, especially when it’s time to pray. The astrolabe was so popular and influential as a terrestrial connection to the cosmos that, to this day, nearly two-thirds of the brightest stars in the night sky retain their Arabic names.
The name typically translates into an anatomical part of the constellation being described. Famous ones on the list (along with their loose translations) include: Rigel (Al Rijl, “foot”) and Betelgeuse (Yad al Jauza, “hand of the great one,” in modern times drawn as the armpit), the two brightest stars in the constellation Orion; Altair (At-Ta’ir, “the flying one”), the brightest star in the constellation Aquila, the eagle; and the variable star Algol (Al-Ghul, “the ghoul”), the second brightest star in the constellation Perseus, referring to the blinking eye of the bloody severed head of Medusa held aloft by Perseus. In the less-famous category are the two brightest stars of the constellation Libra, although identified with the scorpion in the heyday of the astrolabe: Zubenelgenubi (Az-Zuban al-Janubi, “southern claw”) and Zubeneschamali (Az-Zuban ash-Shamali, “northern claw”), the longest surviving star names in the sky.
At no time since the eleventh century has the scientific influence of the Islamic world been equal to what it enjoyed during the preceding four centuries. The late Pakistani physicist Abdus Salam, the first Muslim ever to win the Nobel Prize in Physics, lamented:
There is no question [that] of all civilizations on this planet, science is the weakest in the lands of Islam. The dangers of this weakness cannot be overemphasized since honorable survival of a society depends directly on strength in science and technology in the conditions of the present age.
Plenty of other nations have enjoyed periods of scientific fertility. Think of Great Britain, and the basis of Earth’s system of longitude. The prime meridian is the line that separates geographic east from west on the globe. Defined as zero degrees longitude, it bisects the base of a telescope at an observatory in Greenwich, a London borough on the south bank of the River Thames. The line doesn’t pass through New York City. Or Moscow. Or Beijing. Greenwich was chosen in 1884 by an international consortium of longitude mavens who met in Washington DC for that very purpose.
By the late nineteenth century, astronomers at the Royal Greenwich Observatory—founded in 1675 and based, of course, in Greenwich—had accumulated and catalogued a century’s worth of data on the exact positions of thousands of stars. The Greenwich astronomers used a common but specially designed telescope, constrained to move only along the meridional arc that connects due north to due south through the observer’s zenith. Because the telescope does not track the general east-to-west motion of the stars, they simply drift by as Earth rotates. Formally known as a transit instrument, such a telescope allows you to mark the exact time a star crosses your field of view. Why? A star’s “longitude” on the sky is the time on a sidereal clock the moment the star crosses your meridian. Today we calibrate our watches with atomic clocks, but back then there was no timepiece more reliable than the rotating Earth itself. And there was no better record of the rotating Earth than the stars that passed slowly overhead. And nobody measured the positions of passing stars better than the astronomers at the Royal Greenwich Observatory.
During the seventeenth and early eighteenth centuries, Great Britain had lost many ships at sea due to the challenges of navigation that result from not knowing your longitude with precision. In an especially tragic disaster in 1707, the British fleet, under Vice Admiral Sir Clowdesley Shovell, ran aground on the Scilly Isles, west of Cornwall, losing four ships and two thousand men. That was finally impetus enough for England to commission a Board of Longitude, which offered a fat cash award—£20,000—to the first person who could design an ocean-worthy chronometer. Such a timepiece was destined to be important in both military and commercial ventures. When synchronized with the time at Greenwich, such a chronometer could determine a ship’s longitude with great precision. Just subtract your local time (readily obtained from the observed position of the Sun or stars) from the chronometer’s time. The difference between the two is a direct measure of your longitude east or west of the prime meridian.
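The arithmetic behind that subtraction is simple enough to sketch in a few lines of Python (a modern illustration only, with hypothetical numbers): Earth turns 360 degrees in 24 hours, so each hour of difference between Greenwich time and local time corresponds to 15 degrees of longitude.

```python
# A minimal sketch of the navigator's arithmetic described above.
# Earth turns 360 degrees in 24 hours, so each hour of difference between
# Greenwich time (the chronometer) and local solar time equals 15 degrees.

def degrees_west_of_greenwich(chronometer_hours: float, local_hours: float) -> float:
    """Subtract local time from the chronometer's Greenwich time; positive
    results mean west of the prime meridian, negative mean east."""
    diff_hours = chronometer_hours - local_hours
    diff_hours = (diff_hours + 12) % 24 - 12   # keep within -12..+12 hours
    return diff_hours * 15.0

# Example (hypothetical numbers): local noon while the chronometer reads 16:40
# puts the ship 4 hours 40 minutes behind Greenwich, about 70 degrees west.
print(degrees_west_of_greenwich(chronometer_hours=16 + 40/60, local_hours=12.0))
```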
In 1735 the Board of Longitude’s challenge was met by a portable, palm-sized clock designed and built by an English mechanic, John Harrison. Declared to be as valuable to the navigator as a live person standing watch at a ship’s bow, Harrison’s chronometer gave renewed meaning to the word “watch.”
Because of England’s sustained support for achievements in astronomical and navigational measurements, the Royal Observatory at Greenwich landed the prime meridian. This decree fortuitously placed the international date line (180° away from the prime meridian) in the middle of nowhere, on the other side of the globe in the Pacific Ocean. No country would be split into two days, leaving it beside itself on the calendar.
From the 1890s until the 1930s British institutions also made stunning advances in physics. Atoms are mostly empty space: a small, dense nucleus packed with positively charged protons and neutral neutrons, surrounded at a distance by negatively charged electrons. These three particles are the principal components of matter. We take this fundamental knowledge for granted, as though it had been known forever. But using clever tabletop experiments, as well as early versions of particle accelerators, it was J. J. Thomson who discovered the electron in 1897, Ernest Rutherford who discovered the proton in 1919, and James Chadwick who discovered the neutron in 1932.
Impressed that it was all done in the same country? It all happened in the same building: the Cavendish Laboratory at the University of Cambridge. And it was data from that lab that forced a new generation of theorists to abandon classical concepts of physics in favor of the new branch of science known as quantum mechanics, a description of matter and energy that applies to nature on its smallest scales. To the world’s community of physicists, the original Cavendish Laboratory is hallowed ground.
If the English have forever left their mark on particles and on the spatial coordinates of the globe, our basic temporal coordinate system—a solar-based calendar—is the product of an investment in science by the Roman Catholic Church. The incentive was driven not by cosmic discovery itself but by the need to keep the date for Easter in the early spring. So important was this need that Pope Gregory XIII established the Vatican Observatory, staffing it with erudite Jesuit priests who tracked and measured the passage of time with unprecedented accuracy. By decree, the date for Easter had been set to the first Sunday after the first full moon after the vernal equinox (preventing Holy Thursday, Good Friday, and Easter Sunday from ever falling on a special day in somebody else’s lunar-based calendar). That rule works as long as the first day of spring stays in March, where it belongs. But the Julian calendar of Julius Caesar’s Rome was sufficiently inaccurate that by the sixteenth century it had accumulated ten extra days, placing the first day of spring on March 11 instead of March 21. The four-year leap day, a principal feature of the Julian calendar, had slowly overcorrected the time, pushing Easter later and later in the year.
In 1582, when all the studies and analyses were complete, Pope Gregory deleted the ten offending days from the Julian calendar: the day after October 4 was declared to be October 15. The Church thenceforth made an adjustment: for every century year not evenly divisible by four hundred, a leap day gets omitted that would otherwise have been counted, thus correcting for the overcorrecting leap day itself.
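The century-year correction amounts to a three-line algorithm. Here is a minimal sketch in Python of the rule as just described (every fourth year is a leap year, except century years, unless the century year is divisible by four hundred):

```python
def is_gregorian_leap_year(year: int) -> bool:
    """The Gregorian rule as described above."""
    if year % 400 == 0:
        return True       # 1600, 2000, 2400 keep their leap day
    if year % 100 == 0:
        return False      # 1700, 1800, 1900 omit the leap day the Julian rule would add
    return year % 4 == 0  # the ordinary four-year Julian cycle

# Dropping three leap days every four hundred years trims the average calendar
# year from the Julian 365.25 days to 365.2425, very close to the tropical year.
print([y for y in (1600, 1700, 1800, 1900, 2000) if is_gregorian_leap_year(y)])  # [1600, 2000]
```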
This new “Gregorian Calendar” was further refined in the twentieth century to become even more precise, preserving the accuracy of your wall calendar for tens of thousands of years to come. Nobody else had ever kept time with such precision. Enemy states of the Catholic Church (such as Protestant England, and its rebellious progeny, the American colonies) were slow to adopt the change, but eventually everyone in the civilized world, including cultures that traditionally relied on Moon-based calendars, adopted the Gregorian calendar as the standard for international business, commerce, and politics.
Ever since the birth of the Industrial Revolution, the European contributions to science and technology have become so embedded in western culture that it may take a special effort to step outside and notice them at all. The Revolution was a breakthrough in our understanding of energy, enabling engineers to dream up ways to convert it from one form to another. In the end, the Revolution would serve to replace human power with machine power, drastically enhancing the productivity of nations and the subsequent distribution of wealth around the world.
The language of energy is rich with the names of those scientists who contributed to the effort. James Watt, the Scottish engineer who perfected the steam engine in 1765, has the moniker best known outside the circles of engineering and science. Either his last name or his monogram gets stamped on the top of practically every light bulb. A bulb’s wattage measures the rate at which it consumes energy, which correlates with its brightness. Watt worked on steam engines while at the University of Glasgow, which was, at the time, one of the world’s most fertile centers for engineering innovation.
The English physicist Michael Faraday discovered electromagnetic induction in 1831, which enabled the first electric generators. The farad, a measure of a device’s capacity to store electric charge, probably doesn’t do full justice to his contributions to science.
The German physicist Heinrich Hertz discovered electromagnetic waves in 1888, which enabled communication via radio; his name survives as the unit of frequency along with its metric derivatives “kilohertz,” “megahertz,” and “gigahertz.”
From the Italian physicist Alessandro Volta we have the volt, a unit of electric potential. From the French physicist André-Marie Ampère, we have the unit of electric current known as the ampere, or “amp” for short. From the British physicist James Prescott Joule, we have the joule, a unit of energy. The list goes on and on.
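These eponymous units all fit together with simple arithmetic. A brief sketch (with made-up household numbers, purely for illustration) shows how volts, amps, watts, and joules relate:

```python
# Illustrative numbers only, not measured data.
volts = 120.0             # electric potential (Volta)
amps = 0.5                # electric current (Ampère)
watts = volts * amps      # power (Watt): 60 W, the draw of a classic light bulb
seconds = 3600.0          # one hour of use
joules = watts * seconds  # energy (Joule): 216,000 J consumed in that hour

print(f"{watts:.0f} W for one hour = {joules:,.0f} J")
```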
With the exception of Benjamin Franklin and his tireless experiments with electricity, the U.S. as a nation watched this fertile chapter of human achievement from afar, preoccupied with gaining its independence from England and exploiting the economies of slave labor. Perhaps the best we could do was pay homage in the original Star Trek television series: Scotland, birthplace of the Industrial Revolution, is also the country of origin of the chief engineer of the starship Enterprise. His name? “Scotty,” of course.
In the late eighteenth century the Industrial Revolution was in full swing, but so too was the French Revolution. The French used the occasion to shake up more than the royalty; they also introduced the metric system to standardize what was then a world of mismatched measures—confounding science and commerce alike. Members of the French Academy of Sciences led the world in measures of the Earth’s shape and had proudly determined it to be an oblate spheroid. Building on this knowledge, they defined the meter to be one ten-millionth the distance along the Earth’s surface from the North Pole to the equator, passing through—where else?—Paris. This measure of length was standardized as the separation between two marks etched on a special bar of platinum alloyed with iridium. The French devised many other decimal standards that (except for decimal time and decimal angles) were ultimately adopted by all the civilized nations of the world except the U.S., the West African nation of Liberia, and the politically unstable, tropical nation of Myanmar. The original artifacts of this metric effort are preserved at the International Bureau of Weights and Measures—located, of course, near Paris.
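That pole-to-equator definition is also why Earth’s circumference comes out so close to a round number; a quick check (assuming a perfectly spherical Earth, which the Academy knew it was not) makes the point:

```python
# The original meter: one ten-millionth of the arc from the North Pole to the
# equator through Paris, so one quadrant of Earth measures 10,000 km by design.
meters_per_quadrant = 10_000_000
circumference_km = 4 * meters_per_quadrant / 1000
print(circumference_km, "km")   # 40000.0 km, a tidy consequence of the definition
```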
Beginning in the late 1930s the U.S. became a nexus of activity in nuclear physics. Much of the intellectual capital grew out of the exodus of scientists from Nazi Germany. But the financial capital came from Washington, in the race to beat Hitler to build an atomic bomb. The coordinated effort to produce the bomb was known as the Manhattan Project, so named because much of the early research had been done in Manhattan, at Columbia University’s Pupin Laboratories.
The wartime investments had huge peacetime benefits for the community of nuclear physicists. From the 1930s through the 1980s, American accelerators were the largest and most productive in the world. These racetracks of physics are windows into the fundamental structure and behavior of matter. They create beams of subatomic particles, accelerate them to near the speed of light with a cleverly configured electric field, and smash them into other particles, busting them to smithereens. Sorting through the smithereens, physicists have found evidence for hordes of new particles and even new laws of physics.
American nuclear physics labs are duly famous. Even people who are physics-challenged will recognize the top names: Los Alamos, Lawrence Livermore, Brookhaven, Lawrence Berkeley, Fermilab, Oak Ridge. Physicists at these places discovered new particles, isolated new elements, informed a nascent theoretical model of particle physics, and collected Nobel Prizes for doing so.
The American footprint in that era of physics is forever inscribed at the upper end of the periodic table. Element number 95 is americium; number 97 is berkelium; number 98 is californium; number 103 is lawrencium, for Ernest O. Lawrence, the American physicist who invented the cyclotron, an early particle accelerator; and number 106 is seaborgium, for Glenn T. Seaborg, the American physicist whose lab at the University of California, Berkeley, discovered ten new elements heavier than uranium.
Ever-larger accelerators reach ever-higher energies, probing the fast-receding boundary between what is known and unknown about the universe. The big bang theory of cosmology asserts that the universe was once a very small and very hot soup of energetic subatomic particles. With a superduper particle-smasher, physicists might be able to simulate the earliest moments of the cosmos. In the 1980s, when U.S. physicists proposed just such an accelerator (eventually dubbed the Superconducting Super Collider), Congress was ready to fund it. The U.S. Department of Energy was ready to oversee it. Plans were drawn up. Construction began. A circular tunnel fifty miles around (the size of Washington DC’s beltway) was dug in Texas. Physicists were eager to peer across the next cosmic frontier. But in 1993, when cost overruns looked intractable, a fiscally frustrated Congress permanently withdrew funds for the $11 billion project. It probably never occurred to our elected representatives that by canceling the Super Collider they surrendered America’s primacy in experimental particle physics.
If you want to see the next frontier, hop a plane to Europe, which seized the opportunity to build the world’s largest particle accelerator and stake a claim of its own on the landscape of cosmic knowledge. Known as the Large Hadron Collider, the accelerator will be run by the European Center for Particle Physics (better known by an acronym that no longer fits its name, CERN). Although some U.S. physicists are collaborators, America as a nation will watch the effort from afar, just as so many nations have done before.