In The Techno-Human Condition, Allenby and Sarewitz confront the growth of transhumanism as a movement and the history of humans engaging with technics that have shaped the species’ evolution. They additionally issue a particular critique of the Enlightenment. The limits of reason in complex, global technosystems are deeply explored and effectively trounced. In reading this text, one might be inspired to recall Horkheimer and Adorno’s The Dialectic of Enlightenment. However, in such a comparison, we might realize a primary weakness at the heart of Allenby and Sarewitz’s project. Horkheimer and Adorno found in the Enlightenment not simply the limits of reason, but a commitment to the domination of nature, one that extends to the complete domination of humankind. The pinnacle of the Enlightenment is not found in transhumanism (as in Allenby and Sarewitz), but in the concentration camp. If we were to update this scenario, we could say the pinnacle of the Enlightenment is perpetual war for oil, fought by drone planes and cyborg soldiers, in which millions of civilians are murdered (only on the enemy side, of course).
Central to The Techno-Human Condition are two distinct errors. The first is in “the condition” itself, in which both the degree and the timing of technical development are compressed. To Allenby and Sarewitz, the manipulation of human genes is little different from the forming of a flint arrowhead. The building of a railroad happened centuries ago, and because of that, “we” have always compressed time and space. This, for them, is evidence that humans have always been this way, and it is merely part of our “condition” to move forward and engineer human evolution. Complexity is difficult for them; it’s all just a big mess that is often impossible to truly unravel. And for this reason, such compressions must seem acceptable. Not for this reader. This condition is part of a distinct historical project, one that has been successfully identified and critiqued by Lewis Mumford and Jacques Ellul. However, Allenby and Sarewitz write that Mumford and Ellul (as well as Langdon Winner) simply throw up their hands in exasperation at what has been built. This is a misreading of these authors – a selective one. Allenby and Sarewitz are committed to post-industrial civilization and its megatechnics, whereas Mumford, Ellul and Winner – far from merely throwing up their hands – find it to be irredeemable in its monstrous totalitarianism. So, if you accept Mumford’s and Ellul’s history of technics and Winner’s politics of technology, but you also want to preserve the benefits of this civilization, you would simply have to throw up your hands. But if you want a world with less technological domination and more human agency and autonomy, then there’s some work to be done. I’ll return to this issue shortly.
The second error relates to their reliance upon levels of technical systems, where Level I represents the “shop floor” technology, portrayed as a relatively closed system – simple to understand and to measure in cause-and-effect relationships. This is where Horkheimer and Adorno can help us. For them, these mere technologies are much more complicated, for they rely upon materials and labor that require domination. Allenby and Sarewitz, having written materials and human labor out of the picture, portray this level as something magical. The creation of the technology, its maintenance and its reproduction are locked into systems of domination. Perhaps I’m being unfair to Allenby and Sarewitz, for they recognize “problems of wicked complexity” (109-112). It is here where the first and second problems collide. For central to the “problems of wicked complexity” is the competition of diverse, often opposed values. Is this an immutable part of the human condition? Here these two problems explode into a range of other, even messier ones. But Allenby and Sarewitz like messes.
Missing from the many considerations at Level I are two that I consider most important. Both have to do with the allocation of resources – time and material. In a peak-everything world, and in societies that have made reliance upon increasingly scarce resources integral to social reproduction, material scarcity exacerbates the issue of time. Technologically complex societies must grapple with the issue of scarcity – of materials and of the time to deal with their depletion. All of us living in technologically “advanced” societies might acknowledge that no one has the time to play around with uploading consciousnesses onto servers that will likely lack the electricity to power them in a couple of generations. When some of our “best minds” (as Allenby and Sarewitz would describe them) are playing these games, they are making a likely fatal gamble. Perhaps in a world where resources were infinite, each of us might choose ‘yea’ or ‘nay’ on the transhumanist question. It’s not a matter of choice – like the cochlear implant – where some chose to reject Level I techno-fixes in order to demand Level II political and ethical changes. If one accepts that we are all living in a world that may experience a billion or more deaths even if humanity pours all of our scientific and technical knowledge into solving issues related to energy production and carbon emission reduction, the diversion of scientists to deal with rich technophiles’ fears of death and aging – and not by treating those fears as a psychosis, mind you – perhaps borders on evil.
In their discussion of Earth-level matrices of interconnected technics and organization, what Allenby and Sarewitz call Level III systems, they disavow universalism, particularly of an entrenched and inflexible variety. Yet their evaluation of the premises in the first two chapters is predicated on universalist ethics. Multiple objectivities are not considered an option. Since there is confusion among the entire human species over “what it means to be human,” any assertion of such a meaning is discarded as potentially totalitarian. There are two issues here. The first is the closing off of the possibility of two different communities reaching differing consensus that their accepted meaning of humanity excludes things beyond a certain threshold of manipulation of the body or human genetics. If all of humanity cannot agree, there is no good answer. The problem, as they see it, is: who gets to decide what it means to be human, and who gets to preserve that standard? (23). If a group of people decides integral elements of the society are untenable, what are they to do if other groups demand their preservation? This leads to the second issue. If there are competing standards and values, everyone is expected to give up central elements of their cosmological, metaphysical or theological views – except, of course, those who are coming along for the ride on the transhumanist technological juggernaut that will inevitably plow forward, according to Allenby and Sarewitz. Because it is part of our “condition” to move toward more complexity and bigger technics, intervening would not only infringe upon the rights of others, but upon our very nature! This is a most extreme form of determinism, because it is not just our technical infrastructure that determines the path forward, but who we are in our essence – this isn’t Whig history, but Whig genetics!
There is no techno-human “condition.” Humanity, at present, is locked into a web built through a relatively short history of decisions to view humans as outside of nature, compelled to master it through any means. We haven’t a condition, but a legacy. The transhumanists have simply acknowledged how little we have mastered, and in fearing their decrepitude and eventual death, they seek to do to the human body what this species has done to our external nature in a very short time. We might all learn from this very real history lessons that challenge not only the transhumanist project but also the condition in which we find ourselves today. In the attempt to master nature, humans have only built new systems of domination yet mastered little – our progenitors have multiplied our masters.