Fall 2004

The Beginning of a New History

Francis Fukuyama, the American political scientist, strategist and philosopher, is best known as the author of the seminal post-Cold War book, The End of History and the Last Man.

Jacques Attali, the French futurist and founding president of the European Bank for Reconstruction and Development, now heads PlaNet Finance in Paris.

NPQ editor Nathan Gardels asked both men to respond to the same set of questions about Bill Joy's thesis.

This exchange appeared in both NPQ and Le Monde des Débats in Paris in 2000.

NPQ | Bill Joy, the chief scientist at Sun Microsystems and one of the leading technologists of the Internet revolution, has become alarmed that rapid and combined advances in robotics, genetics and nanotech (micromachines) could end up giving runaway technology the upper hand over the human species.

Joy questions whether "the future will need us," which seems similar to the idea of a "post-human history." Are your images of the future the same as Joy's?

Francis Fukuyama | The term "post-human" history for me really has to do with the question of human nature. In this sense, biotechnology is in a different category from nanotechnology and robotics. It will mean fundamental changes in what we ourselves are, rather than changes in our external environment that may harm us.

A self-replicating robot will not affect human nature in a way qualitatively different from the way we are now affected by the threat of nuclear weapons.

In other words, the threats Joy talks about are basically threats to the body--viruses, computer or biological, robots that reproduce themselves and threaten human control. That, of course, is something to take seriously.

But the challenge posed by biotech--and by that I mean everything, including recombinant DNA, that flows out of the human genome project--is an alteration on the level of the soul. And these changes might be so subtle that it could be a long time before we know what we've done to ourselves. Who knows in advance what effect intervening in the complex interaction among our vast array of genes will produce?

It may help to look at the issue from an historical perspective. The period from the French Revolution through the end of the Cold War saw the rise of different doctrines that hoped to overcome the limits of human nature through the creation of a new kind of human being, one that would not be subject to the prejudices and limitations of the past.

The collapse of those experiments by the end of the 20th century demonstrated the limits of social engineering and endorsed, in my view, a liberal, market-based order grounded in human nature. That is what I meant by "the end of history" in the Hegelian-Marxist sense of the progressive evolution of human political and economic institutions.

It could be, though, that the tools of 20th-century social engineering--from early childhood socialization to psychoanalysis to agitprop and labor camps--were just too crude to alter the natural substratum of human behavior.

In this century, however, the open-ended character of the biotech and life sciences revolution suggests we now may have the tools to accomplish what social engineers failed to do in the past. Human nature would thus be transformed, and we would be embarking on a new kind of history.

The question of "post-human history" is far more fundamental than the concerns Joy addresses. It has to do with the basic human repertoire of emotions, cognitive capabilities and even longevity of life. This represents a vast scaling up of the possibilities of technological manipulation that humanity has heretofore not encountered.

Jacques Attali | The discovery of new technologies is like the discovery of new continents: We may one day approach a moment where the new world will take control of the old one. Bill Joy's question about whether the combined and rapid advances in genetics, robotics and nanotechnology will lead to their dominance over the human species is thus an appropriate one.

Queen Christina of Sweden once challenged Descartes' proposition that man is nothing more than a machine by saying: "I never saw my clock making babies." But, in the not too distant future, we will indeed witness the cloning of robots as well as the cloning of men. And both have in common something essential: the bypassing of sexuality.

In the end, it is this artificial replication that unites the new technologies. Indeed, hatred of sexuality is one of the main engines of technological progress. The whole point is to obtain everything without human intervention, by eliminating touch, direct contact, the human interface. Sexuality, after all, is associated with death as the other name of life. To self-replicate through cloning in the lab is to conquer death; the self-replicating robot, many believe, can reach eternity.

Yet, we cannot dispense with sexuality if we are also going to transmit memory and consciousness to the person. It is sexuality that makes the individual.

As long as science is unable to get beyond this obstacle, the danger of self-replicating robots (which may well grow diverse through random mistakes in programming) dominating the human species will remain slight.

NPQ | Controversially, Joy has said we must take seriously the terrorist Unabomber's (Ted Kaczynski) thesis that we are imperceptibly drifting into a dependence on our machines to the point where they will control us instead of vice-versa. He also finds merit in the Unabomber's fear that the only alternative scenario is the rise of new "elites" who will "domesticate" the masses like animals in order to control the dangerous effects of "knowledge-enabled" technologies available to everyone.

For Joy, this is the main conundrum: The openness and democracy of our liberal societies that gave rise to the information revolution in the first place will empower small groups and extremists to employ "knowledge-enabled" technologies in undemocratic, destructive ways. Simpler to use than to deter, such technologies favor attack over defense.

Witness the "love bug," or before that the 15-year-old Canadian boy who disabled CNN Online with a self-replicating virus sent from his bedroom desktop computer. Or the still unknown hacker who shut down the super-secret US National Security Agency for several days in 2000.

How do we cope with this conundrum of the liberal information age?

Fukuyama | There can be no question that we are all really in trouble when technologies are as easy to use and to own as they are dangerous. We were lucky that nuclear weapons turned out to be very difficult to manufacture, something only capable nation states have been able to do so far.

If a nuclear bomb could be whipped up in the attic or basement, some nut would surely have done it. So, if Joy is right about the dangerous capacities of "self-replicating," "knowledge-enabled" (and thus democratic and widely available) technologies, then he is right to be worried.

Certainly, a biologically engineered germ that could wipe out 10,000 people would cross a threshold far beyond what we have been used to with small terrorist bomb blasts; so would a computer virus that caused, say, the Social Security data bank to be erased.

Joy's thinking, though, tends to make a straight-line prediction about technology itself. So far, the dastardly use of dangerous technologies has turned out to be limited in some way. People have been speculating about biological weapons falling into the wrong hands for decades now. In theory, a lot of damage can be done. In practice, biological agents are very difficult to handle without contaminating those who want to use them against others. This has slowed their use for terrorism significantly. Ultimately, the Aum Shinrikyo cult, which released sarin gas in the Tokyo subway, was unsuccessful in doing the damage it intended.

Second, Joy abstracts away the political institutions that will necessarily grow up as countermeasures to anything as dangerous as he projects.

The real question, therefore, is not a technological one, but a political-institutional one. What is the likely interplay between those who want to control the likely consequences of technology and those who will try to evade such controls?

Attali | The planetary spread of computer viruses across the Internet demonstrates the fragility of networks. We are at the beginning of a new kind of war between sedentarians and nomads. Sedentarians will use all means to kill real and virtual travelers, as we saw in the Philippines with the tourists kidnapped from Malaysia, as well as with the "love bug" virus.

There will be more episodes to come in a gigantic carnival of terrorists, some of whom will hide behind the most beautiful ethics or values.

To counter this we need an ethic of the new nomadic age of travel, both to monitor tourism and to manage the travel of messages across the Net. But an ethic means nothing without a police force to enforce it.

There will be no Network age without adequate instruments at a world level. We need a world police force, and it cannot be only an American one, to control rogue states or, above all, rogue non-state groups and individuals who will be the pirates along the new routes of the future.

The most surprising consequence of the new technologies is the need for a new, efficient world police. Without one, the new technologies will become the instruments of private police forces. The real danger is not one Big Brother, but a host of smaller private big brothers. In short, we must urgently face the question of a world government.

NPQ | If the democratic access and use of a technology is so dangerous, is the only answer for elites to control it?

Fukuyama | If any technology is so dangerous that its possession by one crackpot can cause massive damage, then there will likely be a democratic consensus for control.

If not, one can only envision the breakdown of society into a state of nature where people employ horrible ways to survive and get back at each other. I don't think we are headed that way yet.

Attali | Yes. The danger otherwise is that the control of technology is left to the scientific experts, or to no one.

NPQ | One idea Joy raises is "to relinquish the pursuit of knowledge and development of those technologies so dangerous that we judge it better if they are never available." Is that really viable?

Fukuyama | Now that we've gotten on this technological escalator, it is extremely difficult to renounce science, beginning with the scientists themselves. In my experience, any suggestion to scientists that society may have broader purposes in wanting to slow down or stop technological progress is usually met with a wall of incomprehension. Among scientists there is a general assumption, dating back to Francis Bacon, that scientific progress is for the betterment of all mankind.

Perhaps the time is coming, thanks to arguments by technologists like Bill Joy, when that assumption can be questioned in a serious way. Certainly, there is no prima facie reason that more scientific progress is, automatically, always best.

Again, due to the specific character of nuclear weapons, we have managed to slow down the proliferation process--at least keeping it in the hands of nation states--through diplomatic and institutional means as well as deterrent strategies that were designed for that purpose.

Attali | Nobody will renounce science, but it will be possible to orient it for the good of mankind. Why not create an equivalent of the Pugwash Conferences convened during the Cold War by scientists and public figures seeking to avoid nuclear war? Like Pugwash, the new movement would generate an awareness of the perils and promises of the future of science.

Mankind now has the means to commit species suicide. That deserves some attention. Yet, my fight against nuclear proliferation has taught me that people are not in the mood today to be worried by very long-term threats. They are much too focused on survival in the short term.

NPQ | If the motor that drove "History" forward according to Hegel and Marx was the contradiction between human freedom and necessity, perhaps the motor of "post-human history" is the conflict between freedom and those technologies we've created to overcome necessity, especially biotechnology? It will be the struggle to realize the promise of the Genome Age, such as regenerative medicine, while preserving dignity, individuality and freedom.

Fukuyama | Yes, I think so. And the struggle will come in many forms. For example, in democratic societies we accept a degree of inequality given to us by nature. Our institutions thus tend to be based on merit and equality of opportunity, not result, because we have assumed we must deal with the biological set of cards we are dealt.

In the future, this may no longer be a sound assumption because our biological makeup can be reengineered. When that becomes a public issue, it will totally reshape politics because such issues are so central to people's moral concerns. Certainly, this set of conflicts presented to us by modern science will be the stuff of a new history.

Attali | The motor of history will be the contradiction between selfishness, as embodied in the quest for freedom and equality, and altruism, as embodied in the quest for brotherhood.

As we enter the age of networks, people will have to care for others because everyone's interests and happiness will be linked to those of others.