MuseLetter #189 / January 2008
by Richard Heinberg
This month, due to extraordinary circumstances, it is not possible to send a new piece of writing. This is a great disappointment to me, and I assure you that, beginning next month, MuseLetter will resume publication of fresh material. Meanwhile, I hope you will enjoy the essay below, one of my favorites from recent years.
Best wishes, Richard
Reprint of MuseLetter #99 / April 2000
by Richard Heinberg
The Future of Technology
This morning I was awakened at 6:30 by a robot - not a wheeled electronic valet named "Robbie" bringing me orange juice and toast, but an automated fax machine dialing my phone number and beeping expectantly into my answering machine, hoping to provide me with some helpful advertisement. While I won't go so far as to say the experience ruined my day, it nevertheless cost me some sleep and provoked me to reflect somewhat darkly on our species' present and future relationship with technology.
Before I even get started I can hear some readers thinking, "There goes Heinberg again, spreading his paranoid negativity. Why doesn't he ever look on the bright side? I bet he isn't even going to mention all the existing and potential benefits of technology. He's just going to spend the next four thousand words whining about how rotten machines are and how they're going to make our lives even worse in the future. Meanwhile there he sits, using his computer and printer to tell us how computers are ruining us."
Guilty as charged. I'm paranoid, negative, and sometimes hypocritical (at least in my writings on technology; I try to be cheerful and optimistic in my personal life). In my defense I would merely point out that I'm not alone. On my desk sits the April issue of Wired magazine (a publication I don't often read) featuring a lengthy cover article by Bill Joy - cofounder and Chief Scientist of Sun Microsystems - titled "Why the Future Doesn't Need Us," in which the author ruminates on the likelihood that 21st-century technology will cause the extinction of biological humans. The article is not as systematic and clear-headed an analysis of the relationship between society and machines as one might expect from, say, Kirkpatrick Sale, Jeremy Rifkin, Chellis Glendinning, Jerry Mander, or Theodore Roszak; nor does it ring with the sagacity of Lewis Mumford's The Myth of the Machine. But for the Wired audience, Joy comes across as a latter-day, high-tech Jeremiah railing against the blindness of his colleagues in building a monster neither they nor anyone else can control.
The new technologies that Joy finds most likely to change our lives in the 21st century are genetic engineering, nanotechnology, and robotics (GNR). He writes:
[B]ecause of the recent rapid and radical progress in molecular electronics - where individual atoms and molecules replace lithographically drawn transistors - and related nanoscale technologies, we should be able to exceed Moore's law rate of progress for another 30 years. By 2030, we are likely to be able to build machines, in quantity, a million times as powerful as the personal computers of today. . . . As this enormous computing power is combined with the manipulative advances of the physical sciences and the new, deep understandings in genetics, enormous transformative power is being unleashed. These combinations open up the opportunity to completely redesign the world, for better or worse: The replicating and evolving processes that have been confined to the natural world are about to become realms of human endeavor.
Joy discusses inventor Ray Kurzweil's prediction (published in the latter's The Age of Spiritual Machines) that we will gradually replace ourselves with robotic technology, and eventually be able to "download" our consciousnesses into immortal computers. "But if we are downloaded into our technology," asks Joy, "what are the chances that we will thereafter be ourselves or even human?" His fear is that "On this path, our humanity may well be lost."
Joy reminds us that
The nuclear, biological, and chemical (NBC) technologies used in 20th-century weapons of mass destruction were and are largely military, developed in government laboratories. In sharp contrast, the 21st-century GNR technologies have clear commercial uses and are being developed almost exclusively by corporate enterprises. In this age of triumphant commercialism, technology - with science as its handmaiden - is delivering a series of almost magical inventions that are the most phenomenally lucrative ever seen.
However, the new non-military technologies could have disastrous unintended consequences even more awful than the intended effects of 20th-century "NBC" weapons: genetic engineering combined with nanotechnology could lead to self-replicating robots the size of bacteria - too tough, small, and rapidly spreading to stop - that might show up as a "gray goo" efficiently disassembling and obliterating all life.
Joy freely acknowledges that his own work (which centers on making computer software more reliable) is at least tangentially complicit in these developments. He is disturbed by the direction of events in which he is participating, but not disturbed enough to abandon ship. He advocates more public discussion, more conferences. "In the end," he says, "it is because of our great capacity for caring that I remain optimistic we will confront the dangerous issues now before us." Ironically, perhaps, some of the best writing in the article is contained in a lengthy quote from Ted Kaczynski, the Unabomber - who, despite obvious ethical failings, at least maintained a minimally hypocritical relationship with technology by actually practicing the primitivist, Luddite lifestyle that his writings implicitly preached. Joy, in contrast, is tortured, guilt-ridden, and unable to offer any realistic solution to the potential horrors he illuminates.
The Technological Cornucopia
Joy's article is being discussed widely by the techno-intelligentsia, mostly in dismissive terms - if James Pinkerton's recent syndicated Newsday op-ed piece (titled "Fearful Scientists Let Science Down") is any indication. According to Pinkerton, Joy should spend less time warning about technology's apocalyptic potentialities (which should be "the province of theologians and doomsday cultists") and devote more attention to doing what scientists do best - solving problems. If technology presents humankind with novel dilemmas, says Pinkerton, then it is up to the technologists to give answers.
Okay, let's give the technoids their due. They do solve problems, after all. Just how might 21st-century technologies get us out of our current jams? I have written extensively elsewhere about the perils of biotechnology. But what of its promise? Millions of children in India and Bangladesh suffer from vitamin A deficiency; researchers are now perfecting a genetically engineered strain of rice rich in that micronutrient. Other genetic scientists are hard at work finding cures for genetic diseases like Tay-Sachs disease and sickle-cell anemia. Further ahead, it will be possible in principle to solve the problems of world hunger and topsoil loss by engineering bacteria to produce all necessary human nutrients, in limitless quantities, in vats. As the human lifespan is extended by redesigning future generations to be healthier and more disease-resistant, we will at the same time be able to replace organs more easily with spare parts grown in pigs genetically tailored to produce perfect transplants.
If biotechnology offers the hope of life extension, the marriage of biology with computer science points the way toward immortality. While it might be interesting to make clones of, say, Einstein or Mozart, the genetically gifted duplicates would not necessarily develop the same talents as the originals. But suppose we could preserve more than just the genes of future geniuses; what if we could (as Kurzweil puts it) download their thought processes, their very consciousnesses, into hyperpowerful computers? Imagine what the Dalai Lama or physicist Stephen Hawking might achieve if, instead of living a mere 80 years or so, they - or their digitized minds, at any rate - could still be around a millennium from now, adapting to new cultural trends while building on past accomplishments. Imagine a world in which the sorrows of death and loss were things of the past.
Today we face the prospect of eventual shortages of essential materials. With a growing population, there just won't be enough of everything to go around. Absent technology, there are only two options: population reduction (most likely through starvation, war, or pestilence), or a widening of the income gap between rich and poor so that a tiny minority of the human population enjoys the dwindling reserves of resources while everyone else maintains an increasingly tenuous hold on existence. With nanotechnology, microscopic self-replicating robots could synthesize, atom by atom, any substance we need from virtually any raw material whatsoever. Hence no shortages; no more wealth disparities.
Tomorrow's high technology could be much friendlier to the environment than today's. Currently we derive most of our energy from fossil fuels, whose burning releases greenhouse gases into the atmosphere. In the future, bioengineered hyperefficient microorganisms will produce limitless quantities of hydrogen - a fuel whose combustion produces only pure water as a byproduct. At the same time, giant satellites will collect sunlight, convert it to electricity, and beam it down to Earth. Endangered species will be cloned or otherwise genetically preserved until some future time when computer-driven nanobots will have replicated their now-disappearing habitats. Meanwhile, designer microbes will eat all our toxic wastes.
What if these new technologies are incapable of undoing - quickly enough - the damage we've already caused to our planet's life-support systems? Perhaps other technologies could help us make the best of a tragic situation: we (a few of us, at any rate) could colonize other planets, making a fresh start. In these new Edens we would apply the wisdom gained from examining our mistakes back on Earth. Using nanobots, we could terraform even minimally hospitable worlds, manufacturing all the oxygen, water, and food we could ever need. And, when those planets faced threats (the death of our Sun, collisions with comets, more human failures), the population could set out for still newer colonies.
In short, it is difficult to identify a problem for which no technological solution can be imagined.
The Limits to Invention
The key word in the previous sentence is "imagined." In the paragraphs above I mentioned no objections to the technological vision of the Rapture (oddly, while Pinkerton draws a parallel between Bill Joy's technophobic dystopia and religious apocalypticism, he completely ignores the well-demonstrated roots of techno-utopianism in the Christian search for redemption in the transformation of nature; see David Noble's The Religion of Technology: The Divinity of Man and the Spirit of Invention, 1998). However, there are many such objections, which center on the observations that (a) most technological "solutions" entail unintended negative consequences; (b) some technologies simply don't work as promised; and (c) we may be incapable of funding the development and implementation of the technology along the lines envisioned.
In 1900, futurists who took the automobile seriously tended to see it as a plaything, or perhaps as a solution to the problem of horse feces clogging city streets (this was indeed a serious predicament in places like New York and Chicago). Those futurists were right - up to a point. The automobile did solve the dung dilemma, but of course it created pollution problems of its own (exhaust fumes, discarded tires, rusting heaps of obsolete dream machines). Think of the unintended consequences of agricultural petrochemicals and asbestos insulation. The robot that woke me up this morning provides yet another illustration: I'm sure the pioneers of robotics saw their work as leading to the reduction of human toil; but "toil" can sometimes imply "rewarding work" or "human accountability." In this instance, I would have preferred hearing a real human being on the other end of the phone line (I was unable to tell the automated fax machine what I thought of it, nor to ask it not to call me again). But at least these inventions did something they were intended to do, even if they did other less-desirable things as well.
What about plain technological failures? Biofuels offer a telling example. Their purpose, of course, is to provide an energy source, ideally a cheap one. Even if we leave aside all discussion of the unintended negative effects of biofuels production, the fact remains that biofuels have utterly flunked their basic assignment. Taking into account the energy costs of growing crops and crushing or distilling them into fuels, they have so far often proven to be a net energy sink rather than a source. We can afford to produce them only in an energy environment dominated by abundant and cheap fossil fuels.
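The net-energy argument above reduces to simple arithmetic: a fuel is a genuine source only if it returns more energy than its production consumes. The sketch below makes the test explicit; the function name and the figures in it are hypothetical illustrations of mine, not drawn from any particular biofuel study.

```python
def eroei(energy_out_mj: float, energy_in_mj: float) -> float:
    """Energy returned on energy invested (EROEI).

    A ratio above 1.0 means the fuel is a net energy source;
    below 1.0 it is a net sink, viable only when subsidized
    by some other, cheaper energy supply.
    """
    if energy_in_mj <= 0:
        raise ValueError("energy invested must be positive")
    return energy_out_mj / energy_in_mj

# Hypothetical ethanol batch: 80 MJ of fuel delivered, but 100 MJ
# spent on farming, fertilizer, and distillation.
ratio = eroei(80.0, 100.0)
print(f"EROEI = {ratio:.2f} -> {'source' if ratio > 1 else 'sink'}")
```

With these illustrative numbers the batch is a net sink: every unit of fuel delivered cost more than a unit of fossil energy to make.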
Will the 21st-century technologies work as promised? Will they have disastrous side effects? Surely we Cassandras should be permitted our doubts.
Biotech and nanotech promise to overcome the basic survival constraints of the human species. Their advocates imply that we will no longer be dependent on soil, or conventional sources of water or energy. How realistic are such promises? Suppose we could synthesize a complete human diet out of biochemicals produced by genetically engineered bacteria. Is it likely that humans would thrive on such a diet? Or might the biochemists' understandings of what humans actually need to eat for health maintenance fall short? Recall our efforts since 1945 to replace plant nutrients in topsoil with petrochemicals: we managed to increase yields-per-acre dramatically, but the nutritional quality of our food has faded even as poisons have accumulated in our bodies. Living soil has proven to be irreducibly complex and irreplaceable by synthetic substitutes. Isn't the same likely to be true of traditional foods? How can we adequately replace something we don't fully understand?
The holiest of grails for the new technologists would be a substitute for fossil fuels. Given a cheap, abundant source of concentrated energy, the process of invention can continue almost endlessly - or, at least, until inventions' side effects do us in. Without such an energy source, the invention process will grind nearly to a halt (at least for machines that require lots of energy for their construction or operation). Now, it may indeed be possible in principle to engineer microorganisms to give off hydrogen, and we could theoretically reorganize human society around this new energy source. But there is a problem. The energy investment required to fund that reorganization has to be made in the coin of the current energy source - i.e., fossil fuels. The energy costs of research, development, and retooling our social infrastructure to run on hydrogen will be considerable, and the transition will require a minimum of twenty years - assuming the enthusiastic support of oil companies, other industries, and politicians. To "bootstrap" the transition (funding it primarily with energy derived from the new source) would delay it by several more decades. But we don't have many decades of leeway. With global oil production likely to peak around 2010, there simply won't be enough energy available to fund the existing needs of the social-political-industrial infrastructure of modern civilization for much longer, let alone to provide for a wholesale redesign and replacement of that infrastructure.
It's true that technology sometimes exceeds our expectations. A few prominent scientists in the 1890s stated that all important inventions and discoveries had already been made. The events of the 20th century proved their judgment laughable. On the other hand, promised technological transformations often fail to occur.
Aerospace futurists of the 1960s and '70s universally believed that by the year 2000 humankind would have established permanent colonies on the Moon, and perhaps Mars as well (recall Stanley Kubrick's film 2001: A Space Odyssey). That clearly has not happened; in fact, the likelihood that we will ever station colonies beyond Earth orbit is now in doubt. Why has our expectation of technology been scaled back so radically in this instance? The simple answer is, it costs too much. For the time being, putting a permanent human colony on the Moon costs too much in dollar terms: the politicians who control purse strings cannot be cajoled into lavishing tens of billions on the space program when a lesser amount spent building prisons will generate far more campaign contributions. But soon Moon colonies will cost too much in energy terms: the millions of tons of fuel needed to loft to Luna a dozen people, along with all their needed accouterments, will be unavailable.
In the late 19th century humankind discovered the extraordinary energy bonanza present in fossil fuels. We have used oil to revolutionize transportation, agriculture, medicine, and commerce. Its concentrated power has helped us create a lifestyle in which the typical American now has roughly 300 "energy slaves." It has enabled us to extract resources more quickly and thoroughly than ever before, and to subjugate (through trade, debt, and military intimidation) the whole of what has come to be called the Third World. Its use in agriculture and medicine has led to dramatic increases in human population. It has also permitted us to expand and intensify our means of killing, so that wars have become more horrific than ever.
If, from the beginning of the last century, we had managed this energy subsidy better, we might have used it to fund the development of a modest, nonpolluting, sustainable energy regime that would carry us many centuries into the future. But we didn't. We have spent the past decades guzzling and wasting energy on a scale never before imaginable. Now there isn't enough left to fund a large-scale transition to another, less concentrated energy system; and the political and economic power centers built around the existing profligate energy complex are discouraging even the attempt.
Recently [in 1999] oil prices rose dramatically as the result of a temporary cut in production by the OPEC nations; financial mavens reassured us that there was nothing to worry about because energy now represents only a small part of the overall economy. Information and high tech account for more dollar-flow than does gasoline. The economists forgot that, in the final analysis, it's all energy. Without fuel, computers cannot be manufactured or delivered; biotech scientists cannot even get to work, much less plug in their polymerase chain reaction machines. We will recover from the recent gasoline price hike, but our oil dependency and refusal to conserve make the longer-term picture less rosy.
Can biotechnology make us immune to all diseases, smarter, stronger, more beautiful? Can artificial intelligence render the human brain obsolete? In principle, perhaps. Given a few more decades of cheap, concentrated energy, we might accomplish all sorts of wonderful, frightening things. But there may be limits.
What Is Technology For?
These practical limits may finally force us to confront philosophical issues we should have been contemplating all along. Are machines for people to use? Or are people primarily useful as inventors and custodians of machines? These questions seem ridiculous on their face. Of course machines serve people and not the other way around. But is that really true?
To answer, we would first need to think about what humans actually want and need. We require not just food, water, and shelter, but the time and opportunity for rewarding social interactions. For the sake of our psychological health we need to confront challenges that are within our power to overcome. We need to feel a sense of connection with, and belongingness in, the world around us - both the social world and the natural world.
One could well argue (as I have done in other essays) that those basic needs are actually met better in extremely low-tech gatherer-hunter societies than in modern industrial nations. The point is arguable. However, assuming for the moment that at least some basic human needs are not well addressed by the technological society, then what real purpose do the machines serve? Clearly, the high-tech regime suits well the requirements of a complex social system in which a minority of the population seeks to siphon wealth from the majority while capturing a maximum of resources and energy from nature.
Rather than machines serving our needs, it seems that we are often regimented to the requirements of our tools. We schedule our lives around clocks, build our houses to suit our automobiles, and jump to respond to beepers and cell phones. We become increasingly dependent upon the technological system, and in our dependency we surrender incrementally our freedom and spontaneity. Just as our ancestors in the early Neolithic domesticated animals to perform agricultural work, we have, in a profound sense, domesticated ourselves.
Are all tools therefore pernicious? I tend to agree with Mumford, who held that, while every tool changes its user, some do so in ways that are especially pernicious. He distinguished what he called "authoritarian technics" from "democratic technics": the former term refers to machines or systems of machines that cannot be designed, built, or controlled by individuals or small groups acting spontaneously and autonomously, systems that require and reproduce dependency and obedience. The nuclear reactor furnishes a useful example: its construction entails hierarchically organized armies of experts, and the wastes it produces require that future generations organize further teams of experts and armed guards in order to prevent pollution from escaping and to ensure that fissile materials don't fall into the hands of terrorists. By accepting the nuclear reactor, we also accept that our society must perpetually be characterized by secrecy and the domination of an administrator-expert class.
By "democratic technics," Mumford meant the tools that any individual or small community could use to directly meet its needs or express its artistic spirit. The "democratic" alternative to the nuclear reactor might be a windmill, which a family or community could use to do heavy work (grinding, pumping, lifting, etc.). Democratic tools often require more skill on the part of the end user (it's harder to build and operate a home-based power system than to flip a switch connected to a nuclear power grid), but entail less dependency and more satisfaction.
What Machines Will Survive?
For the next decade or two at least, we are likely to see a bifurcation and polarization in technologies. On one hand, the development of powerful new tools like the ones Bill Joy has written about will continue. On the other, we'll see the emergence of many new "appropriate technologies" - simple, elegant machines that do more with less, while requiring more muscle-power and attention from the user. Already the discerning democratic technician can, with a bit of effort and a few basic tools, build for herself a simple hand-cranked refrigerator, a solar oven, or a composting toilet. Photovoltaic panels currently require fossil fuel-fed factories for their manufacture. It is conceivable that, in the near future, improvements in design could permit construction of small-scale, decentralized, solar-powered, worker-owned PV factories.
As appropriate technologies become ever more efficient and affordable, they will offer an increasingly attractive alternative to the mega-tech system on which most of us have become uncomfortably dependent. But optimistic visions of a quiet, peaceful, and happy transition are probably hopelessly unrealistic. The promoters of biotech, nanotech, artificial intelligence, and the current oil-coal-gas regime will not simply fold their tents and go away. In the near future, most people in the industrial world are likely to continue to become ever more addicted to that regime. Some time in the first half of this new century, the lights will begin to fade. The energy for the technological society will begin to run out. What will happen then? That's anybody's guess. No computer can tell us. In all likelihood, those who will already have adopted appropriate technologies will have a leg up. But the high-tech world will not simply disappear overnight. Nuclear waste will be with us for millennia. New bioengineered mutant species could continue breeding and alter ecosystems forever. Moreover, the gasoline won't disappear all at once: as the supply begins to dry up, what's left will likely be commandeered by military and police forces, and by wealthy techno-barons. Heaven knows what Dr. Strangelove-like mischief will be unleashed by such characters in the last days of the technological society.
Meanwhile, what can any of us personally do in the cause of techno-sanity? To the degree that we are dependent on the current energy regime, we are all complicit in its depredations. The only solution therefore is to reduce our dependence, and thus our complicity, by conserving energy and making a personal transition to appropriate technologies.
However, this personal solution is by itself insufficient to prevent significant harms that may result from some of the authoritarian technologies now being designed and implemented. More people need to be alerted to the increasingly compromised position in which their dependency is placing them, and to the side effects of the authoritarian megamachine. Hence the need for activism of all sorts - from opposition to genetically modified foods to journalistic analyses of global trade agreements. Many people who are doing this necessary work will be unable immediately to put much effort into building alternative, off-grid dwellings, and may have to continue using computers and jet transport, at least in modest ways. The range of activities necessary to achieve a minimally disruptive transition to a post-high-tech world is such that there is simply no room for purist, holier-than-thou attitudes on the part of specialists in any of the various alternative movements. The degree of our success will likely hinge on the degree to which we are able to work together and honor one another's contributions.
I will not be here a century from now to see whether events have fulfilled my predictions. My consciousness will not have been downloaded into a robot. My body will not have been gene-enhanced to extend its longevity. Neither will I be pestered by automated telemarketers. Mortality does have its rewards.