The years between the first hydrogen bomb
tests and the Limited Test Ban Treaty in 1963 saw more than just
increased anxiety about the effects of nuclear testing on weather. They
also saw increased interest in large-scale, purposeful environmental
modification. Most climate modification enthusiasts spoke of increasing
global temperatures, in the hopes that this would increase the quantity
of cultivated land and make for fairer weather. Some suggested
blackening deserts or snowy areas, to increase absorption of radiation.
Covering large areas with carbon dust, so the theory went, would raise
temperatures. Alternatively, if several hydrogen bombs were exploded
underwater, they might evaporate seawater and create an ice cloud that
would block the escape of radiation. Meteorologist Harry Wexler had
little patience for those who wanted to add weather and climate
modification to the set of tools in man’s possession. But by 1958 even he
acknowledged that serious proposals for massive changes, using nuclear
weapons as tools, were inevitable. Like most professional
meteorologists, in the past he had dismissed the idea that hydrogen
bombs had affected the weather. But with the prospect of determined
experiments designed to bring about such changes, he warned of “the
unhappy situation of the cure being worse than the ailment.”
Whatever
one might have thought about the wisdom of tinkering with the weather
in peacetime, the manipulation of nature on a vast scale for military
purposes seemed to be a perfectly legitimate application of scientific
knowledge. For those planning a total war against the Soviet Union, every avenue begged for exploration. When the scientific advisors of America's key NATO allies considered how the alliance might fight in the future, they presented numerous ideas for creating catastrophic events through natural processes, especially using hydrogen bombs as triggers. In these discussions, held as early as 1960, top scientists debated the fundamental environmental question: can humans have a long-term effect on the global environment?
The
desire for novel military technology seemed especially urgent by the
early 1960s. Although officially part of the International Geophysical
Year, the Soviet Union’s launch of Sputnik in October 1957 had clear
military ramifications. Not only did it begin the space race but it also
took the arms race to a new stage that included communications
satellites and intercontinental ballistic missiles. The launch of
Sputnik made the world seem smaller and made the most far-fetched
visions of the future seem possible. The gee-whiz, Buck Rogers feel of
the immediate postwar years returned. But this wave of technological
enthusiasm was darker, because instead of coming on the tide of a war
victory, it came as a foreboding new competition. For years the
Americans had been preparing for the missile age, gathering data on the
atmosphere and on the earth’s gravity over the poles. The Soviets
clearly had kept pace. Sputnik served as a justification for a vast
array of projects to use scientific knowledge to tamper with nature on a
large scale.
Reinforcing the sense of urgency, President
Eisenhower’s special committee on weather modification submitted its final
report in January 1958, just months after Sputnik’s launch. The
committee’s chairman, retired Navy Captain Howard T. Orville, said at a
press conference that he suspected that the Soviets already had begun a
large, secret program on weather control. Despite routine dismissals of
the idea throughout the decade by meteorologists, the high-level
committee ranked weather control ahead of hydrogen bombs and satellites
in military significance. Orville urged the government to support
research on controlling large-scale weather systems, not just
rainmaking. He further suggested that finding ways to manipulate the heat
balance between the sun and earth might be the key to weather and
climate control. The earth already had been heated up by man’s efforts,
by introducing carbon dioxide into the atmosphere through the burning of
fossil fuels. This carbon dioxide helped to trap the heat and create,
as the New York Times put it, a “greenhouse effect.” It might be possible
to harness this greenhouse effect. “If such steps are feasible,”
journalist John Finney reported, “then New York City might be put under a
few hundred feet of ice or a few hundred feet of water depending on
whether the temperature was raised or lowered.”
Rumors spread
quickly about scientists in the United States and Soviet Union
experimenting with unprecedented tools for controlling nature. Were the
Soviets planning to dam the Bering Strait? Were the Americans able to
steer storms? Naysayers pointed out that meteorologists could not even
predict naturally occurring weather, so how could anyone control it? One
author opined in the New York Times, “For would it not be foolish for
anyone to talk of controlling an intricate piece of apparatus until he
knew precisely how it worked?” After the report of Eisenhower’s special
committee was made public, scientists in allied countries received
strange, sheepish letters from their defense establishments, asking if
the latest rumors about American research could be true. For example, a
British Air Ministry scientific advisor, E. V. Truefitt, presented his
countryman, oceanographer George Deacon, with “one or two questions
which have come up in odd conversations.” He called them “wild cat”
ideas that he did not really take seriously, yet they appeared to be in
discussion in the United States. Despite his instinct that they could
not possibly be real, he felt obligated to run them by a competent man
of science.
One of the ideas was to melt the polar ice cap by
exploding nuclear weapons on it, thus raising the global sea level. The
Soviets might be considering it, so the rumor went, to drown cities in
the United States and Western Europe. Another idea was to change ocean
currents or temperatures to interfere with an enemy’s climate and food
production. Truefitt had no idea how to assess an ocean-initiated climate change, but he had made a rough calculation to determine what would be needed to melt the polar ice cap. He believed that it would take about a million tons of fissile material to melt enough ice to raise sea level by 30
feet. “This is a large amount of fissile material whichever way you look
at it,” he wrote to Deacon, “and consequently my guess is that it is not
the kind of project that even the Americans would embark on under the
influence of Sputniks.”
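Truefitt's working figures are not reproduced here, but the general scale of such an estimate can be sketched with a rough back-of-the-envelope calculation, assuming round modern values for the ocean's surface area, the latent heat of ice, and the energy released by complete fission:

$$
\begin{aligned}
m_{\text{ice}} &\approx A_{\text{ocean}}\,\Delta h\,\rho \approx (3.6\times10^{14}\,\mathrm{m^2})(9\,\mathrm{m})(10^{3}\,\mathrm{kg/m^3}) \approx 3\times10^{18}\,\mathrm{kg},\\
E_{\text{melt}} &\approx m_{\text{ice}}\,L_f \approx (3\times10^{18}\,\mathrm{kg})(3.3\times10^{5}\,\mathrm{J/kg}) \approx 1\times10^{24}\,\mathrm{J},\\
m_{\text{fissile}} &\approx \frac{E_{\text{melt}}}{8\times10^{13}\,\mathrm{J/kg}} \approx 10^{10}\,\mathrm{kg},
\end{aligned}
$$

that is, on the order of ten million tonnes of fissile material even if every atom fissioned and all of the heat went into the ice. Different assumptions about efficiency and ice volume move the figure up or down, but any such estimate lands in the prohibitively large range Truefitt described.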
Desperate to find the “weapon of the future”
The
truth was that the immediate post-Sputnik years had a peculiar air,
both of desperation and of opportunity. Doors were wide open to a range
of technological possibilities. Nearly anything that was technically
feasible made it to the highest levels of discussion. For starters, that
meant revisiting the questions surrounding biological, chemical, and
radiological weapons. But it also sparked discussion of the ambitious,
the horrendous, and the quirky. Like wildcatters exploring for oil,
American scientists grasped desperately around them, striving to find the
next weapon of the future.
There were several post-Sputnik efforts
to push the limits of the “possible,” to explore exotic ideas that
might prove decisive 5, 10, or 20 years into the future. Some actions to
direct this scientific work were high profile and public. President
Eisenhower created a science advisory committee to guide the course of
American technology and ensure that the Americans did not fall behind
the Soviet Union. This President’s Science Advisory Committee (PSAC)
also existed to rein in some of the wilder ideas, to avoid wasteful
spending. Other brain trusts, often dominated by physicists with
expertise in nuclear affairs, sprang up behind closed doors to advise
military establishments. One of these was “JASON,” an elite group of
scientists who got together during the summer months to assess major
scientific and technological problems of military significance. Paid by
government contract through several different bodies throughout its
existence, “the Jasons,” as they called themselves, were drawn from the
cream of civilian academic science. Despite their outsider status, the
Jasons gained the respect and trust of officials in the Defense Department
and the armed services, and their advice often revolutionized military
thinking during the nuclear era.
Sputnik did not just spark new
scientific projects, however. It also revolutionized military strategy,
making it grimmer than ever. The American and Soviet air forces realized
they were going to have to rethink the basic notion of national
vulnerability. No longer could the Air Force’s Strategic Air Command
count on scrambling bombers and flying them over the North Pole. Most of
the war’s damage would have been done before bombers left the Western
Hemisphere.
More than that, as a secret National Academy of
Sciences group advised the Air Force in 1958, the range of possible
wars soon would expand exponentially. Conflicts were going to become both
more total and more limited at the same time. On the one hand, the
United States was losing its ability to incapacitate enemy forces. In
practice that meant that the most attractive targets all over the world
would be centers of population — cities — rather than armies or
airfields. Making an effective attack against enemy military forces seemed
a dwindling prospect in an era when missiles could be put into hardened
silos, mobile rocket units, or submarines patrolling the oceans.
Cities, by contrast, would be ripe for plucking. “Weapon yields,
delivery accuracies, and force level requirements for city destruction
are modest,” these scientists concluded, while attacking heavily
fortified bunkers would require large and accurate payloads. That meant
that finding ways of maximizing civilian death would assume an even
greater importance than it already had.
On the other hand, nuclear
parity would make full-blown conflict less likely, meaning that all of
the armed services would have to reorient themselves back to
conventional warfare. As RAND Corporation game theorists had long
feared, the atomic bomb was a wasting asset—and the window of
opportunity to “win” decisively in a war against the Soviet Union had
passed. By the late 1950s, the new orthodoxy in strategic thinking
accepted that the Soviet Union was committed to avoiding a nuclear
holocaust and that it intended to encourage “brushfire” wars instead.
Small wars like those in Malaya and Korea would become more common. As
the 1960s dawned, military strategists wondered about the fate of
Vietnam, which the French had failed to hold. By treaty the country had
split between North and South. Should the communist North Vietnamese
invade, would the Americans consider using nuclear weapons?
Some may
have argued that nuclear bombs were America's answer to the human population imbalance with, say, China or Southeast Asia.
But new studies at RAND had dismissed this possibility, showing that
nuclear weapons would be ineffective against guerrilla forces in
Southeast Asia and would visit enormous collateral damage upon friendly
population centers. So the military would need to let go of President
Eisenhower’s preferred strategy of massive retaliation as America’s
basic posture.
The Air Force would have to stop relying on
aircraft designed purely to deliver nuclear weapons. Instead, it would
need to find ways of fighting men, tanks, rockets, and airplanes—all
without nuclear weapons. A decade earlier the Navy had bitterly opposed
the Air Force’s claims that the era of aircraft carriers and battleships
had ended. Now it seemed that the Navy had been right. The new
conventional wisdom, which President Kennedy (a former Navy man) soon
would establish as the doctrine of “flexible response,” was that the
nations with the greatest range, flexibility, and cleverness in weapons
systems would stand strongest. This meant conducting research on weapons
at various levels of destruction up to and including nuclear bombs and
being creative about their uses.
It also meant combining modes of
warfare across scientific disciplines. Geophysical and biological
knowledge might be united, for example, in developing dispersal
mechanisms for pathogens. In trying to achieve large area coverage, one
might fall back on cloud-seeding techniques—with the important difference
that the “seeds” would not be silver iodide to cause rain but pathogens
to spread disease far and wide. For example, certain phenomena in air
masses, such as “Polar Outbreaks” (thrusts of cold air from the poles
toward the equator), seemed to have great potential for such seeding,
especially given the Soviet Union’s meteorological vulnerability from
the north.
Military research embraces science fiction
The
post-Sputnik national pall of gloom encouraged American scientists to
explore unorthodox weapons, and they left no stone unturned. The U.S.
military forged ahead with research on weapons using radiation, particle
beams, nuclear energy, and kinetic energy. The Army Chemical Corps even
investigated the use of lysergic acid diethylamide (LSD) and cannabis
as non-lethal, incapacitating agents. The National Academy of Sciences
noted this approvingly in 1958 and suggested that the Air Force begin
administering LSD to airmen as soon as possible, to judge whether to add
it to the arsenal of chemical weapons.
With so many wide-ranging
ideas being vetted, NATO allies worried that the Americans were moving
in too many directions at once. It was fine to support science in the
United States and to speak grandly about possibly controlling forces of
nature—but which ideas could be incorporated into actual NATO war plans?
In 1960 NATO members agreed to convene a special group of scientists
and military leaders to assess the long-term prospects of war. They
wanted to know what would really be feasible by the 1970s, and what was
just science fiction.
This kind of science forecasting was not just
a matter of intelligent people guessing the future. By 1960 it had a
distinguished history of shaping policy, particularly in some parts of
the American military establishment. The Air Force, for example,
understood in the 1950s that much of its strength relied on continuous
research and development (R&D). Toward the end of World War II,
General Henry “Hap” Arnold, commander of the then-Army Air Force,
famously said that “for twenty years the Air Force was built around
pilots, pilots, and more pilots. . . . The next twenty years is going to
be built around scientists.” Throughout the Cold War, such brain
trusts—in think tanks like RAND, secret groups like JASON, and many
others—exercised a remarkable influence on policies.
When NATO
tried, in 1960, to estimate the next 10 to 15 years of weapons development, it
enlisted the leadership of Theodore von Kármán, the grand old man of
science forecasting. By then he was 79 years old. Born in Hungary, von
Kármán had been one of the world’s foremost experts in aerodynamics. He
even had helped the Austrian military design aircraft during the First
World War. In 1929 he came to the United States to head up an
aeronautical laboratory at Caltech, helping to kick-start the aviation
industry in southern California. Acting as scientific advisor to United
States air forces during World War II, von Kármán had initiated a
long-term study of air power that amassed some of the best brains in
physics and aeronautics. The resultant report, Where We Stand, became a
road map for postwar air power research. In subsequent years, von Kármán
repeated this process with other studies, and in fact he chaired the
1958 secret committee advising the Air Force, under the auspices of the
National Academy of Sciences. In 1960 he embarked on a study that would
be the capstone of his long career: NATO’s attempt to grasp the future
face of battle over the entire earth.
Battle: earth
Known
simply as the Von Kármán Committee, the new group included the chief
scientific advisor of each national defense organization in the United
States, Britain, Canada, France, and West Germany. With several working
groups of scientists under them, they ran the gamut of new weapons in an
era of “total war.” They covered the typical range of military
subjects, including aircraft, weaponry, and ships. But they also delved
deeply into the implications of the global physical environment,
particularly in light of the extraordinary size of thermonuclear
weapons, the global reach of ballistic missiles, and the extent of
global monitoring begun during the International Geophysical Year.
The
buzzword of the IGY had been “synoptic.” Taken literally, it meant
observing more than one place at the same time—viewing together. The
IGY’s concept was to take a huge number of observations, spread out over
a variety of geophysical disciplines and geographic areas, all within
an 18-month period. Doing so would provide a portrait of the earth that
was
more true and comprehensive than anything ever attempted.
The
Von Kármán Committee adopted the word “synoptic” too, but applied it to
weapons. Weapons of a “synoptic scale” meant control and domination of
whole physical systems. In military shorthand, the word synoptic called
to mind vastness, encompassing large portions of the earth—or perhaps
all of it. The IGY had brought this idea into military planners’ field of
vision. But while the IGY was concerned with synoptic-scale
measurement, NATO was concerned with synoptic-scale manipulation.
Once
they began to meet, the members of the Von Kármán Committee realized
that they all agreed on at least one thing: the global observations
initiated in the IGY would have to continue indefinitely. The geophysical
factors of modern war involved knowledge of an operational
environment—in other words, how would the sea, land, or air affect troops
and ships? NATO forces needed to be able to operate in any kind of
environment. If it was on planet Earth, NATO should be prepared to fight
there and win.
In fact the U.S. armed services already were
developing environment-specific training centers to give American forces
mastery of three classes of extreme conditions: polar, desert, and
jungle. Given that the northern polar region was “the only large
uncommitted area lying between the world’s strongest antagonists,” polar
operations weighed heavily on defense planners' minds. Polar and Arctic training centers already existed in Greenland, Canada, and Alaska. The United States also operated a desert
warfare center in Yuma, Arizona. Still needed were centers approximating
Mediterranean conditions and tropical ones.
To take advantage of
the apparent shrinkage of the earth due to ballistic missiles, NATO
advisors also pointed out the need to revolutionize the field of
geodesy—earth measurement. Mapmakers relied on data taken from a variety
of oceanic or terrestrial expeditions, sometimes decades or more old.
No one had seen the earth from space, much less taken accurate
measurements based on satellites. Intercontinental ballistic missiles
would require precision. But NATO literally did not know where the
Soviet Union was. “On a world wide scale, we are not sure of the
position of North America in relation to the Eurasian continent.”
Knowledge of anything in the Southern Hemisphere was even less accurate.
The only decent data came from the Americas, Western Europe, Japan, and
some of the former and current European colonial territories. The
Soviets could target the West with accuracy, but the West could not do
the same. Any kind of exact targeting of the Soviet Union would prove
impossible before satellites could take comprehensive measurements. In
the meantime, constant earth measurement from the air would prove
essential. Fortunately, international scientific projects were providing
that data.
The IGY had convinced scientists and military planners
of the usefulness of synoptic data collection. If done in real time, or close to it, collection could be automated and extended over a large territory, perhaps even globally. Individual scientists might
never analyze the vast amounts of data, but the data could be fed into
computers in order to monitor and
predict environmental conditions.
Already the Americans were working on an anti-submarine warfare
“environmental prediction system.” It collected oceanographic
information—to estimate sonar performance—and combined it with
meteorological information to predict future oceanographic conditions.
Had
the members of the Von Kármán Committee been military historians, there
is little doubt about what they would have cast as the “decisive
moment” in the history of global strategy. Time and again they called to
mind the changes brought about by the advent of earth-orbiting
satellites. It would prove to be, they believed, a dividing line between
military eras. It promised total monitoring of the global environment, a
vision of the future that was pervasive across the range of sciences
and military operations. By 1970, these NATO advisors predicted,
scientists would be able to identify and track thunderstorms as they
occurred all over the entire earth and to keep the earth’s radiation
under constant surveillance. Old charts would be discarded, in favor of a
constantly refreshing set of data beamed down from the heavens.
Automated data systems would be necessary to achieve accuracy of
measurement and improved forecasting. As the committee put it: “The
concept of inaccessible geographical areas is no longer
valid—observations over enemy-held, oceanic and uninhabited areas are as
easily made as elsewhere.” Reliance on existing charts and data,
collected laboriously by error-prone humans, rarely uniform from country
to country, seemed archaic. New methods of continuous, uniform data
collection of the oceans, land, and space would provide the kind of
mastery of the global environment that the Von Kármán committee
envisioned.
Climate change as warfare
Aside
from this unprecedented ability to forecast conditions and improve
global accuracy, the NATO science advisors also predicted ambitious,
large-scale manipulation of the environment. The brass ring of military
geophysics was weather control. Scientists already had achieved modest
results in increasing rainfall or dissipating fogs. Yet these successes
required optimal conditions and certainly could not be projected over a
large area or from a long distance. But what about climate control?
In
a 1955 Fortune article, mathematician John von Neumann had suggested
that militaries would be able to make large-scale changes to climate. He
pointed out various ways to alter oceans and seas. One was to blanket
ice sheets with blackening agents, to absorb more light and melt them.
If it could be done to Greenland, its ice sheet alone would raise sea
levels by about 10 feet “and cause great discomfort to most world
ports.” Another scheme was to divert the Gulf Stream, which would
severely change the climate of Northern Europe. Still another idea was
to dam the Bering Strait. Such alterations would have clear, long-term
effects on world climate. And these changes seemed possible. Reflecting on
von Neumann’s predictions, the NATO group believed that an
extraordinary tool lay in the hands of military planners: the hydrogen
bomb. “It is perhaps true,” the committee concluded, “that means
presently within man’s reach could be employed so as to alter global
climate for long periods.”
Given the later controversy about the
role of carbon dioxide in inducing global climate change, the focus on
the hydrogen bomb might seem surprising. But the reason for this was
simple. Advised by physicists, the defense establishments of NATO’s
strongest members believed that in order for “synoptic scale” weapons to
be feasible, man had to achieve physical power that was comparable to
nature’s power. The only tool that seemed likely to provide that was the
hydrogen bomb. Although professional meteorologists had insisted that
hydrogen bomb tests had not created the extreme winters of 1954, 1958,
and 1962, these military advisors were less adamant. They knew that the
energies of nature were vast, but felt they might be shaped by man. It
seemed that the Soviets were working hard on the problem. Canadian
scientists repeated the oft-heard rumor that the Soviets were planning
large-scale manipulation of the oceans, along with drastic modification
of climate, by damming up the Bering Strait. The Canadians reasoned:
surely the Russians had in mind the use of nuclear bombs?
NATO scientists found the prospects of such power over nature intriguing. They called it
environmental warfare.
“This kind of warfare has the peculiarity that it could look like our
image of nuclear war, or could be so subtle that the ‘weapons’ and
‘battles’ are hard to identify.” The enemy might undertake a vast
engineering project to change the climate of a whole region, “leading
gradually to economic ruin and loss of strength.” This could be done
even without declaring war.
Once again ecological vulnerability
emerged as a crucial area in need of study for military purposes. The
NATO science advisors did not yet understand their true vulnerability to
what they called “living weapons.” But new data were coming in. Since
the late 1950s, American engineers had planned to use thermonuclear
explosions to excavate a harbor in Alaska, part of the program dubbed “Plowshare.”
Beforehand they put together what today might be called an environmental
impact statement and discovered that the effect on the Eskimos’ diet
might not be as negligible as originally assumed. For this and other
reasons, the project was scrapped.
But that knowledge had been
useful for military thinking. Scientists had traced the pathway of
radioactivity through the food chain. NATO scientists now used the
example of the Eskimos’ ecosystem to argue for more advanced knowledge
of ecological warfare. Within that ecosystem, Eskimos lived
interdependently with seals, otter, fish, caribou, and plankton. If the
plankton were all killed, an Eskimo’s ecological community would be
utterly destroyed. “At best he would have to move,” the group pointed
out. “At worst he would die.” This kind of thinking could be tailored to
particular regions: “The people of Asia depend on rice and a very few
other crops. Something like a lethal rice-rust or blight could make life
in Asia much more difficult and perhaps untenable.”
As a weapon
system, ecological links went further than killing—they also promised
biological coercion. Destruction of the enemy need not be the goal.
Getting rid of plankton, for example, would make the Eskimos’ entire
food system collapse and force them to be entirely dependent on food
supplied from outside the region. To achieve this, toxic agents “may be
developed to attack essential links in various ecological chains.” The
aim would be to shape an existing interdependent web along new lines,
“to force the ecology to accept dependence on some crop or animal which
cannot live at all in the homeland.” Doing this would put the victim in
an extremely disadvantageous position, “leading to a gradual loss of
power and position and inevitable vassalage.”
Von Kármán died
shortly after the first of his committee reports was completed.
As colleagues remembered his contributions to aeronautics and to scientific advising, his death lent the committee's findings extraordinary authority within NATO; the reports had the air of a final act of service, and the chairman's passing only augmented their importance. With von Kármán gone, the reports stood as a foreboding, Cassandra-like vision of the future that military planners could ignore only at their peril. This was especially true of subjects that the committee felt it did not yet fully understand.
Environmental warfare becomes real
Environmental
warfare had captured the imagination of the committee, but the results had been unsatisfying. It seemed in keeping with the direction of
science—toward global, synoptic-scale activities. Yet it was unclear how
it might shape weaponry. The experience of the Von Kármán Committee
established “environmental warfare” as a distinct concept, and it was
not long before NATO reconvened the members to look into the subject
more fully. They realized that there were commonalities between the work
on geophysics and the ongoing work on radiological, biological, and
chemical weapons. Both involved alterations to the natural world with
potentially devastating human consequences. Military technology seemed
on the verge of an unprecedented ability to tap the forces of nature on a
massive scale.
Thus in late 1962, NATO summoned scientists and
military planners to Paris to hammer out what might legitimately come
out of “environmental warfare” and what the long-term consequences might
be. The man who tried to fill von Kármán's shoes was another Hungarian,
nuclear physicist Edward Teller, who joined the group as a “special
advisor.” Known widely as the father of the hydrogen bomb, Teller
already was deeply committed to using nuclear explosions for massive
earthmoving projects, such as the construction of harbors. He also saw
great potential in developing novel uses of nuclear weapons in wartime.
Along with Teller, committee members were drawn from national defense
establishments and from the U.S. Advanced Research Projects Agency
(ARPA).
The central question almost always remained the same: were
natural forces susceptible to human influence on a large, even global,
scale? In methodical fashion, these military planners broke down
environmental warfare into distinct spheres of possibility,
corresponding with the layers of the earth and its atmosphere as it
extended into space: lithosphere and hydrosphere (land and oceans),
troposphere (lower atmosphere), stratosphere and ionosphere (upper
atmosphere), and exosphere (outer space). Some of the earlier “wildcat”
ideas were quickly dispensed with as impractical, such as using hydrogen
bombs to melt the polar ice caps. But other wildcat ideas were
feasible, particularly using nuclear weapons as triggers for tsunamis in
the oceans, or for altering the weather.
One only had to open a
newspaper to see what natural catastrophes could accomplish. In 1958, in
Alaska’s Lituya Bay, there was a landslide so powerful that it carried
the energy equivalent of a one-kiloton explosion. In May 1960, a wall of
water smashed the Chilean coast over a stretch of several hundred
miles, with wave heights of 5.5 to 13.5 meters. The Chilean earthquake
sent tsunami waves across a large area of the Pacific at speeds in excess
of 400 miles per hour. Even as far away as Hawaii, low-lying areas were
flooded. Thousands of Chileans were killed, and millions were left
homeless. Reporters described the relentless devastation:
The
quakes went on for all of the week, demolishing or damaging thousands
of homes and other buildings, and burying some small communities under
landslides. Whole villages were swept away by tsunamis as high as
twenty-four feet. The quakes were so violent that mountains disappeared,
new lakes were formed and the earth’s surface dropped as much as 1,000
feet in twenty-five miles. The worst quake, last Sunday, released energy
of 240 megatons, equal to that of 1,200 atomic bombs of the type dropped
on Hiroshima and far more than the 174 megatons released by all the
nuclear explosions to date.
Noting deaths all over
the Pacific Rim, the New York Times reported that the Chilean earthquake
“gave tragic testimony that in this age of the conquest of the atom and
of triumphs in outer space man is still helpless against the vast and
still largely unpredictable forces that frequently go berserk in his
immediate environment—hurricanes, volcanoes and earthquakes.”
NATO
saw it differently. Environmental cataclysms could become part of the
alliance’s arsenal, with the help of a well-placed nuclear explosion.
The cascading effects of energy release from the existing instabilities
of nature could be, quite literally, earth shattering. The power over
nature was tempting: “The large engineering capability which is provided
by multi-megaton nuclear weapons might open up the possibility of
changing the course of ocean streams which are known to affect climate
and continents.” Narrow straits could indeed be dammed up, as some
feared the Soviets planned for the Bering Strait. Peninsulas could be
turned into islands, changing the patterns of water flow and mixing. With
enough nuclear bombs, the sea floor in some areas might be reconfigured
entirely.
Even weather control seemed poised to make a quantum
leap forward with the nuclear bomb as a tool. “Real weather control,”
NATO scientists argued, “would mean control of synoptic scale
disturbances—the centers of high pressure and low pressure found on the
daily weather maps.” Such large-scale systems seemed inherently
susceptible to influence, despite the huge energies required.
The sun imparted energy into the air masses constantly, but only some of
it became kinetic energy. Most of the energy was stored, ready to be
released. The results could be quite violent, as in the case of
cyclones. A relatively small release of energy—say, a nuclear bomb—could
trigger a much larger release of natural energy.
One reason that
such widespread and even long-term changes in the earth’s systems seemed
feasible—at least in theory—was the growing realization of how serious
an effect humans already were having upon the upper atmosphere. High in
the sky, major effects seemed well within NATO’s grasp. Nuclear
explosions could create electron clouds some 70–90 kilometers up,
disrupting high-frequency communication. One of the leading researchers
on electron cloud disruption, Jerome Pressman, had been advising the
U.S. Army Signal Corps, the Air Force, and ARPA on this subject for
years. He told the rest of the environmental warfare committee that even
a single nuclear burst could disrupt long-distance communication over a
stretch of a thousand kilometers. If nuclear weapons were exploded in
the atmosphere as a defense against incoming missiles, the range of this
electron cloud would be vast indeed. High-frequency communication
equipment and long-distance radar systems might be rendered useless.
Out
in space—the exosphere—NATO saw great promise in the radiation belts
that American and Soviet satellites had measured during the
International Geophysical Year. The Van Allen belts were actually giant
regions of charged particles trapped by the earth’s magnetic field. They
were sources of intense, persistent radiation that endangered any
equipment or living thing in space. Although the Van Allen belts were
natural phenomena, similar belts could be created artificially by
exploding a nuclear weapon at an altitude of at least 400 kilometers.
Large bombs at even higher altitudes would create an extraordinarily
powerful radiation environment in space. The belts would cloak the
earth, challenging any exit or entrance by missile, satellite, or
spacecraft. Because the belts would be trapped by the earth's magnetic
field, there would be holes in the radiation cloak at the north and south
geomagnetic poles.
Whoever controlled these entry points would have
comparatively easy access to space. That would make the poles even more
important as strategic regions.
In fact, manipulation of the Van
Allen belts already had begun. In 1958 the United States discovered that
its high-altitude tests of “small kilotonnage” had created electron
shells around the earth, about 60 miles thick. Because the operation in
which these tests occurred had been dubbed “ARGUS,” the creation of the
shell became the “ARGUS effect.” Just a few months prior to these NATO
meetings, the United States detonated an even larger explosion at high
altitude—the “Starfish” experiment. As Edward Teller reported, “this is
the first time that the Argus effect was demonstrated on a really big
scale.” An immense number of electrons were caught in the earth’s
magnetic field and “are forming now a new Van Allen belt greater in
electron density than any of the known Van Allen belts.” He confided that
the electrons had damaged the solar cells in
American satellites.
“Why not just drop a bomb?”
Despite
their fascination with these weapons, the committee members struggled with possibilities that defied the logic of nuclear warfare. The military significance of triggering natural catastrophes was not readily apparent. “If the weapon can be exploded a few miles offshore, it
can probably be delivered on, or close to, the target itself, and a far
larger proportion of the energy available would be expended on the
target and not on long lengths of unimportant coast line.” The same
argument could be made against any effort to influence the flow of ocean
currents and thus modify the world’s climate. Why not just drop a bomb
on a city? It seemed more logical.
On the other hand, there might
be great value in environmental devastation in a total war. NATO
advisors had already moved beyond “cities” as targets and had begun to
imagine much larger swathes of territory. Aside from the blast and
radioactive contamination, thermonuclear bombs could have wide-ranging
horrific consequences. Disruptions of dams and levees would lead to
widespread flooding. Drowning and starvation would result, posing a
serious threat to those who managed to survive the bombs.
The most
ghastly environmental threat was the prospect of large-scale fire. In
Whole World on Fire (2004), Lynn Eden has written that military planners
routinely ignored the consequences of huge firestorms caused by a
nuclear explosion’s thermal radiation. She suggests that this led
nuclear strategists to underestimate the catastrophic effects of nuclear
explosions throughout the Cold War. While war plans typically focused on
blast effects, not everyone ignored the totality of death and
destruction from fires. Some military planners considered it part of
environmental warfare. In the early 1960s, scientists and military
planners at the highest levels of NATO faced a stomach-churning analysis
that cast such fires as a way of arming the countryside against the enemy
even when his cities were destroyed.
These fires would
instantaneously ignite a huge area due to the explosion’s initial
thermal radiation, regardless of blast effects. Rather than just use
bombs directly against cities, one could explode a large bomb of about
100 megatons high in the atmosphere, at about 80 kilometers. Doing so
would maximize the amount of thermal radiation that would reach the
earth. Such radiation would ignite flammable material instantly, over an
area of nearly a million square kilometers. As a point of comparison,
the largest recorded forest fire in the United States, which swept Wisconsin and Michigan in 1871, claimed 1,683 lives and spread over 15,000 square kilometers. Setting fire to forests across a million square kilometers would pose intractable problems for an enemy.
Outside the bombed-out cities, the countryside would provide no shelter,
no food, and no hope of survival.
A fire from thermal radiation
would differ from a typical forest fire because it would not need to
spread—instead, the whole area would go up in flames at the same time.
Oxygen would rapidly deplete, leaving any survivors suffocating to death.
It would be impossible to run from it. Rushes of air would create
firestorms with “strong winds of up to hurricane force,” far more intense
than the deadly firestorms created in German and Japanese cities during
World War II. Edward Teller guessed that the energy released in a fire
would exceed that of the nuclear explosion, roughly the equivalent of a
thousand megatons. “This is the most violent and wide-spread
environmental change which can be expected from a nuclear attack,” he
said. If total war were the goal, fires from thermal radiation could
achieve it on a continental scale.
These discussions, recorded for
posterity in NATO meeting minutes, have a surreal feel to them.
Scientists argued about whether hydrogen bombs were more effective as
triggers of vast environmental events, or if they should just be dropped
directly on their targets. Scientists quibbled over the extent of
damage from a fire-raising weapon. Some doubted, for example, that
hurricane-force winds would ensue. It was difficult to argue with the
conclusion, however: “The immediate result would be beyond all
experience.” But some insisted that it would only “likely” be beyond all
experience.
Such intellectualized detachment from human
experience reached new heights when the long-term ecological
consequences of nuclear weapons were imagined. The NATO group recognized
that using nuclear weapons in this way might have severe consequences
for the earth in the long run. But while acknowledging that the effect on
weather and climate might be significant, scientists had little data
with which to generate specific predictions. As for the devastation of
the land, NATO was confident that a succession of vegetation “would
sooner or later re-establish itself, and over a few decades there would
be some ecological recovery.”
The only thing not in doubt in these
discussions was that maximizing human death was the principal goal.
Which was better, Teller and his colleagues asked—drowning villages
along the coast, igniting the countryside with thermal radiation, or
simply laying waste a city? Should humans be contaminated through the
food chain, or beaten into submission through ecological dependence? While
praising the ingenuity of these wildcat ideas, Teller’s own preference
was to bomb cities. If death and devastation were the goals, he
reasoned, why not keep it simple? Mammals, including humans, were more
sensitive to radioactivity than insects, seed plants, or bacteria. It
made little sense to attempt to contaminate man through these less
susceptible organisms when the bomb would do the trick. “Thus the most
economic way to attack populations with nuclear radiation,” the
committee concluded, “is to do so directly rather than through some
element of their surroundings.”
For many in NATO, looking at the
world as a zero-sum game between the nuclear-armed United States and the
Soviet Union, environmental warfare seemed like an inefficient sideshow.
As interesting as ocean manipulation and weather control might be,
nuclear explosions would be required to produce them. In that case,
presumably a real war would have begun, and the enemy could be bombed
directly without resorting to exotic methods such as these. Even in the
case of biological, radiological, and chemical weapons, changing the
environment would be a more circuitous route than attacking directly.
In
trying to imagine uses of environmental weapons, military analysts
working with NATO confronted the same question that has stood at the
center of environmental issues ever since: can human actions have
long-lasting, detrimental consequences upon the earth? As an advocate of
peacetime nuclear testing, Teller had reason to minimize the long-term
impacts of human action, particularly nuclear fallout. He spoke at
length to the committee about how some scientists had exaggerated these
effects, and his point of view prevailed.
The NATO committee concluded
that the dangers of sickness and disease from contamination “are no worse
than the other hazards which would have to be faced by the survivors of
a nuclear war.” As for the long-term genetic effects upon future
generations, the committee toed the line, protesting that the ultimate effects could not be predicted with certainty.
Nevertheless,
some on the committee were convinced that humans were capable of making
large alterations to the environment. Throughout the Von Kármán reports
were repeated references to unpredictable consequences of human action
on the atmosphere. Increasing or decreasing the ozone concentration in
the atmosphere was certainly possible, altering the amount of
ultraviolet light reaching the earth. Deliberate creation of an ozone
hole might confuse surveillance systems; degrade aircraft materials such as rubber, plastic, and glass; and harm humans and crops. Less
purposeful might be the introduction of chemicals from rocket fuel or
other sources, resulting in “large inadvertent changes” in atmospheric
properties.
NATO concluded its assessment of environmental warfare
with a warning that major changes might already be under way. “Much of
the military planning of today assumes that the earth’s atmosphere will
remain substantially as it is,” it wrote.
Reprinted from
“Arming Mother Nature: The Birth of Catastrophic Environmentalism” with permission from Oxford University Press USA. Copyright © Oxford University Press 2013