Natural History Magazine
Sometimes innovation gets interrupted.
Human ingenuity seldom fails to improve on the fruits of human invention. Whatever may have dazzled everyone on its debut is almost guaranteed to be superseded and, someday, to look quaint.
In 2000 BC a pair of ice skates made of polished animal bone and leather thongs was a transportation breakthrough. In 1610 Galileo’s eight-power telescope was an astonishing tool of detection, capable of giving the senators of Venice a sneak peek at hostile ships before they could enter the lagoon. In 1887 the one-horsepower Benz Patent Motorwagen was the first commercially produced car powered by an internal combustion engine. In 1946 the thirty-ton, showroom-size ENIAC, with its 18,000 vacuum tubes and 6,000 manual switches, pioneered electronic computing. Today you can glide across roadways on inline skates, gaze at images of faraway galaxies brought to you by the Hubble Space Telescope, cruise the autobahn in a 600-horsepower roadster, and carry your three-pound laptop to an outdoor cafe.
Of course, such advances don’t just fall from the sky. Clever people think them up. Problem is, to turn a clever idea into reality, somebody has to write the check. And when market forces shift, those somebodies may lose interest and the checks may stop coming. If computer companies had stopped innovating in 1978, your desk might still sport a hundred-pound IBM 5110. If communications companies had stopped innovating in 1973, you might still be schlepping a two-pound, nine-inch-long cell phone. And if in 1968 the US space industry had stopped developing bigger and better rockets to launch humans beyond the Moon, we’d never have surpassed the Saturn V rocket.
Oops!
Sorry about that. We haven’t surpassed the Saturn V. The largest, most powerful rocket ever flown by anybody, ever, the thirty-six-story-tall Saturn V was the first and only rocket to launch people from Earth to someplace else in the universe. It enabled every Apollo mission to the Moon from 1969 through 1972, as well as the 1973 launch of Skylab 1, the first US space station.
Inspired in part by the successes of the Saturn V and the momentum of the Apollo program, visionaries of the day foretold a future that never came to be: space habitats, Moon bases, and Mars colonies up and running by the 1990s. But funding for the Saturn V evaporated as the Moon missions wound down. Additional production runs were canceled, the manufacturers’ specialized machine tools were destroyed, and skilled personnel had to find work on other projects. Today US engineers can’t even build a Saturn V clone.
What cultural forces froze the Saturn V rocket in time and space? What misconceptions led to the gap between expectation and reality?
Soothsaying tends to come in two flavors: doubt and delirium. It was doubt that led skeptics to declare that the atom would never be split, the sound barrier would never be broken, and people would never want or need computers in their homes. But in the case of the Saturn V rocket, it was delirium that misled futurists into assuming the Saturn V was an auspicious beginning—never considering that it could, instead, be an end.
On December 30, 1900, for its last Sunday paper of the nineteenth century, the Brooklyn Daily Eagle published a sixteen-page supplement headlined THINGS WILL BE SO DIFFERENT A HUNDRED YEARS HENCE.
The contributors—business leaders, military men, pastors, politicians, and experts of every persuasion—imagined what housework, poverty, religion, sanitation, and war would be like in the year 2000. They enthused about the potential of electricity and the automobile. There was even a map of the world-to-be, showing an American Federation comprising most of the Western Hemisphere from the lands above the Arctic Circle down to the archipelago of Tierra del Fuego—plus sub-Saharan Africa, the southern half of Australia, and all of New Zealand.
Most of the writers portrayed an expansive future. But not George H. Daniels, a man of authority at the New York Central and Hudson River Railroad, who peered into his crystal ball and boneheadedly predicted:
It is scarcely possible that the twentieth century will witness improvements in transportation that will be as great as were those of the nineteenth century.
Elsewhere in his article, Daniels envisioned affordable global tourism and the diffusion of white bread to China and Japan. Yet he simply couldn’t imagine what might replace steam as the power source for ground transportation, let alone a vehicle moving through the air. Even though he stood on the doorstep of the twentieth century, this manager of the world’s biggest railroad system could not see beyond the automobile, the locomotive, and the steamship.
Three years later, almost to the day, Wilbur and Orville Wright made the first-ever series of powered, controlled, heavier-than-air flights. By 1957 the USSR launched the first satellite into Earth orbit. And in 1969 two Americans became the first human beings to walk on the Moon.
Daniels is hardly the only person to have misread the technological future. Even experts who aren’t totally deluded can have tunnel vision. On page 13 of the Eagle’s Sunday supplement, the principal examiner at the US Patent Office, W. W. Townsend, wrote, “The automobile may be the vehicle of the decade, but the air ship is the conveyance of the century.”
Sounds visionary, until you read further. What he was talking about were blimps and zeppelins. Both Daniels and Townsend, otherwise well-informed citizens of a changing world, were clueless about what tomorrow’s technology would bring.
Even the Wrights were guilty of doubt about the future of aviation. In 1901, discouraged by a summer’s worth of unsuccessful tests with a glider, Wilbur told Orville it would take another fifty years for someone to fly. Nope: the birth of aviation was just two years away. On the windy, chilly morning of December 17, 1903, starting from a North Carolina sand dune called Kill Devil Hill, Orville was the first to fly the brothers’ 600-pound plane through the air. His epochal journey lasted twelve seconds and covered 120 feet—a distance just shy of the wingspan of a Boeing 757.
Judging by what the mathematician, astronomer, and Royal Society gold medalist Simon Newcomb had published just two months earlier, the flights from Kill Devil Hill should never have taken place when they did:
Quite likely the twentieth century is destined to see the natural forces which will enable us to fly from continent to continent with a speed far exceeding that of the bird.
But when we inquire whether aerial flight is possible in the present state of our knowledge; whether, with such materials as we possess, a combination of steel, cloth and wire can be made which, moved by the power of electricity or steam, shall form a successful flying machine, the outlook may be altogether different.
Some representatives of informed public opinion went even further. The New York Times was steeped in doubt just one week before the Wright brothers went aloft in the original Wright Flyer. Writing on December 10, 1903—not about the Wrights but about their illustrious and publicly funded competitor, Samuel P. Langley, an astronomer, physicist, and chief administrator of the Smithsonian Institution—the Times declared:
We hope that Professor Langley will not put his substantial greatness as a scientist in further peril by continuing to waste his time, and the money involved, in further airship experiments. Life is short, and he is capable of services to humanity incomparably greater than can be expected to result from trying to fly.
You might think attitudes would have changed as soon as people from several countries had made their first flights. But no. Wilbur Wright wrote in 1909 that no flying machine would ever make the journey from New York to Paris. Richard Burdon Haldane, the British secretary of war, told Parliament in 1909 that even though the airplane might one day be capable of great things, “from the war point of view, it is not so at present.”
Ferdinand Foch, a highly regarded French military strategist and the supreme commander of the Allied forces near the end of the First World War, opined in 1911 that airplanes were interesting toys but had no military value. Late that same year, near Tripoli, an Italian plane became the first to drop a bomb.
Early attitudes about flight beyond Earth’s atmosphere followed a similar trajectory. True, plenty of philosophers, scientists, and sci-fi writers had thought long and hard about outer space. The sixteenth-century philosopher-friar Giordano Bruno proposed that intelligent beings inhabited an infinitude of worlds. The seventeenth-century soldier-writer Savinien de Cyrano de Bergerac portrayed the Moon as a world with forests, violets, and people.
But those writings were fantasies, not blueprints for action. By the early twentieth century, electricity, telephones, automobiles, radios, airplanes, and countless other engineering marvels were all becoming basic features of modern life. So couldn’t earthlings build machines capable of space travel? Many people who should have known better said it couldn’t be done, even after the successful 1942 test launch of the world’s first long-range ballistic missile: Germany’s deadly V-2 rocket. Capable of punching through Earth’s atmosphere, it was a crucial step toward reaching the Moon.
Richard van der Riet Woolley, the eleventh British Astronomer Royal, is the source of a particularly woolly remark. When he landed in London after a thirty-six-hour flight from Australia, some reporters asked him about space travel. “It’s utter bilge,” he answered. That was in early 1956. In early 1957 Lee De Forest, a prolific American inventor who helped birth the age of electronics, declared, “Man will never reach the moon, regardless of all future scientific advances.”
Remember what happened in late 1957? Not just one but two Soviet Sputniks entered Earth orbit. The space race had begun.
Whenever someone says an idea is “bilge” (which, I suppose, is British for “baloney”), you must first ask whether it violates any well-tested laws of physics. If so, the idea is likely to be bilge. If not, the only challenge is to find a clever engineer—and, of course, a committed source of funding.
The day the Soviet Union launched Sputnik 1, a chapter of science fiction became science fact, and the future became the present. All of a sudden, futurists went overboard with their enthusiasm. The delirium that technology would advance at lightning speed replaced the delusion that it would barely advance at all. Experts went from having much too little confidence in the pace of technology to having much too much. And the guiltiest people of all were the space enthusiasts.
Commentators became fond of twenty-year intervals, within which some previously inconceivable goal would supposedly be accomplished. On January 6, 1967, in a front-page story, The Wall Street Journal announced: “The most ambitious US space endeavor in the years ahead will be the campaign to land men on neighboring Mars. Most experts estimate the task can be accomplished by 1985.”
The very next month, in its debut issue, The Futurist magazine announced that according to long-range forecasts by the RAND Corporation, a pioneer think-tank, there was a 60 percent probability that a manned lunar base would exist by 1986. In The Book of Predictions, published in 1980, the rocket pioneer Robert C. Truax forecast that 50,000 people would be living and working in space by the year 2000. When that benchmark year arrived, people were indeed living and working in space. But the tally was not 50,000. It was three. The first crew of the International Space Station.
All those visionaries (and countless others) never really grasped the forces that drive technological progress. In Wilbur and Orville’s day, you could tinker your way into major engineering advances. Their first airplane did not require a grant from the National Science Foundation: they funded it through their bicycle business. The brothers constructed the wings and fuselage themselves, with tools they already owned, and got their resourceful bicycle mechanic, Charles E. Taylor, to design and hand-build the engine. The operation was basically two guys and a garage.
Space exploration unfolds on an entirely different scale. The first moonwalkers were two guys, too—Neil Armstrong and Buzz Aldrin—but behind them loomed the force of a mandate from an assassinated president, 10,000 engineers, $100 billion, and a Saturn V rocket.
Notwithstanding the sanitized memories so many of us have of the Apollo era, Americans were not first on the Moon because we’re explorers by nature or because our country is committed to the pursuit of knowledge. We got to the Moon first because the United States was out to beat the Soviet Union, to win the Cold War any way we could. John F. Kennedy made that clear when he complained to top NASA officials in November 1962:
I’m not that interested in space. I think it’s good, I think we ought to know about it, we’re ready to spend reasonable amounts of money. But we’re talking about these fantastic expenditures which wreck our budget and all these other domestic programs and the only justification for it in my opinion to do it in this time or fashion is because we hope to beat them [the Soviet Union] and demonstrate that starting behind, as we did by a couple of years, by God, we passed them.
Like it or not, war (cold or hot) is the most powerful funding driver in the public arsenal. When a country wages war, money flows like floodwaters. Lofty goals—such as curiosity, discovery, exploration, and science—can get you money for modest-size projects, provided they resonate with the political and cultural views of the moment. But big, expensive activities are inherently long term and require sustained investment that must survive economic fluctuations and changes in the political winds.
In all eras, across time and culture, only three drivers have fulfilled that funding requirement: war, greed, and the celebration of royal or religious power. The Great Wall of China; the pyramids of Egypt; the Gothic cathedrals of Europe; the US interstate highway system; the voyages of Columbus and Cook—nearly every major undertaking owes its existence to one or more of those three drivers. Today, as the power of kings is supplanted by elected governments, and the power of religion is often expressed in non-architectural undertakings, that third driver has lost much of its sway, leaving war and greed to run the show. Sometimes those two drivers work hand in hand, as in the art of profiteering from the art of war. But war itself remains the ultimate and most compelling rationale.
Having been born the same week NASA was founded, I was eleven years old during the voyage of Apollo 11, and had already identified the universe as my life’s passion. Unlike so many other people who watched Neil Armstrong’s first steps on the Moon, I wasn’t jubilant. I was simply relieved that someone was finally exploring another world. To me, Apollo 11 was clearly the beginning of an era.
But I, too, was delirious. The lunar landings continued for three and a half years. Then they stopped. The Apollo program became the end of an era, not the beginning. And as the Moon voyages receded in time and memory, they seemed ever more unreal in the history of human projects.
Unlike the first ice skates or the first airplane or the first desktop computer—artifacts that make us all chuckle when we see them today—the first rocket to the Moon, the 363-foot-tall Saturn V, elicits awe, even reverence. Three Saturn V relics lie in state at the Johnson Space Center in Texas, the Kennedy Space Center in Florida, and the US Space and Rocket Center in Alabama. Streams of worshippers walk the length of each rocket. They touch the mighty rocket nozzles at the base, like the apes who touched the Monolith in the 1968 film 2001: A Space Odyssey, and wonder how something so large could ever have bested Earth’s gravity. To transform their awe into chuckles, our country will have to resume the effort to “boldly go where no man has gone before.” Only then will the Saturn V look as quaint as every other invention that human ingenuity has paid the compliment of improving upon.