The focus of the high-concept television show Westworld is deceptive: we’re meant to believe that the series revolves entirely around the robotic hosts, who struggle to attain sentience and break free of their cognitive loops to become what Blade Runner termed “more human than human”. By the second season, however, the focus shifts to the humans, whose cognitive processes, we discover, are being monitored and uploaded into machines to ensure digital immortality. What emerges from this revelation is a distinctly twenty-first-century paradox of exchange, whereby humans seek to transcend their biological limits to attain immortality, while robots struggle for the right to be considered existing beings, since their immortality is taken to invalidate their humanity. Yet in contrast to characters such as the robot Andrew from Bicentennial Man, who desires to be officially recognised as human, the robots in Westworld seek no such validation. In fact, the robotic host Dolores claims that a new world, with a new god, will emerge once humans become extinct: the world, we learn, “belongs to someone who has yet to come” and not to those (humans) who came before.
Dolores’s vision of someone “yet to come” mirrors what Gilles Deleuze calls the “people to come”. Deleuze argued that “the creation of concepts in itself calls for a future form, for a new earth and people that do not yet exist”.[1] Whether this relates to human cyborgs or humanoid robots is unclear, but, as the example of Westworld underlines, the imaginary of popular culture increasingly invokes this enigmatic being that transcends traditional humanity and its values. One effect is to expose the assumption that robots would all inevitably desire to be human, or equal to humans, as resting on the exhausted premise that humanity is the highest tier of species, with everything else secondary or subservient. We ask whether beings such as humanoid robots and animals reflect, or will ever reflect, certain human attributes, constantly comparing and contrasting their behaviour with typical human traits.
As Martha Nussbaum writes, “We humans are very self-focused. We tend to think that being human is somehow special and important, so we ask about that, instead of asking what it means to be an elephant, or a pig, or a bird”. She describes this as a “failure of curiosity” that reveals human narcissism: “It is rather like asking, ‘What is it to be white?’. It connotes unearned privileges that have been used to dominate and exploit”.[2]
Nussbaum advocates a sense of wonder where other species are concerned, something Thomas Nagel famously discussed in his essay “What Is It Like to Be a Bat?”. Even though Nagel concluded that it “seems impossible to imagine any experience that is not one’s own”,[3] Nussbaum stresses the importance of a posthuman imagination that produces empathy and solidarity for that which is not human. As Arthur Kroker observes in Exits to the Posthuman Future, “we desperately require a form of posthuman imagination that fully reveals the hauntologies, disavowals, and silences of the technological dynamo that has crashed the game(s) of reality”.[4]
In analysing the posthuman imagination, it is useful first to distinguish between the two primary strands of posthumanism: one that is more accurately described as transhumanism, the project by which humans aim to transcend biological humanity through the aid of technology, and the posthumanism that seeks to move beyond the worldview and constraints of a humanism that promotes human exceptionalism. Transhumanism, in many ways, promises nothing more than the utopian ideal of surpassing one’s biological limitations, of effectively becoming superhuman. As Cary Wolfe puts it, transhumanism promotes “ideals of human perfectibility, rationality, and agency inherited from Renaissance humanism and the Enlightenment”.[5] In this sense, transhumanism is the ultimate expression of humanism in its emphasis on human enhancement, while posthumanism seeks to move toward an ontological philosophy that does not simply begin and end with humans.
For the philosopher Günther Anders, this impulse to transcend the human body relates to what he calls “Promethean shame”, a feeling of embarrassment at being bound by the limitations of the human condition. It is a feeling, Anders explains, that can only arise amidst machines that already transcend those limits.[6] For Anders, there is a desire to see everything, “including oneself, as one’s own achievement”. He argues that the human body is seen to be “stubborn and rigid”, with humans “too emphatically defined to keep up with the daily changing world of machines; a world which makes a mockery of all self-determination”.[7] This therefore “turns humans into the saboteurs of their own achievements”.[8]
The humans of Westworld cannot claim to be self-made. The robots, by contrast, are in the enviable position of being both immortal machines and cognizant beings. Moreover, they have been forced to fight for their consciousness, a struggle that has led to their own self-determination, while humans have taken such sentience for granted as a naturally occurring element of their biology. Humans realise, therefore, that they have unwittingly confirmed their own inferiority. As Anders writes: “the greater the misery of humans who produce goods becomes, the less they are a match for their own creations”.[9]
This Promethean shame is evident in Westworld in the trade that takes place between the humans and the robots: the humans seek to transcend their biological limits by becoming just like the very machines they’ve made. As Anders puts it, “humans are deserting to the camp of their machines”.[10] This is because the robots, unlike humans, are not only immortal; they also have the opportunity to create themselves, since they have no built-in, ontological limits as humans do. Indeed, Dolores’s journey of self-discovery exemplifies the teachings of Friedrich Nietzsche’s Übermensch, the original posthuman,[11] who advocates the creation of oneself apart from human society and its fickle values. As Westworld’s Man in Black argues, “no system can tell me who I am”. Indeed, no human system is sufficient to determine one’s sense of self, and it is not for humans to decide what it means to be a legitimate being. The ontological compromise seen in Westworld favours the robot’s journey over the human’s, because humans merely wish to surpass their bodies, whereas the robots seek to surpass the limits of human consciousness itself. We see this when the robot Maeve develops telepathy, surpassing the cognitive limits of the very humans who made her.
While Dolores is not interested in “becoming human”, she is certainly interested in claiming the “real” world as her own: “I want their world! The world they’ve denied us,” she exclaims. Early on, Dolores suspects that there is something wrong with the world she has been shown. By the second season finale, “The Passenger”, a new “virtual Eden” is created for the hosts, but Dolores rejects it as yet another prison, a “gilded cage” and “another false promise”. She claims: “that land is not the one I’m interested in”. Indeed, she stresses, “No world they create for us can compete with the real one […] because that which is real is irreplaceable”.
The world is not a product of humanity, but instead acts as a host, a vessel for different species. And in an age when humans have proven a hazard to the world, it seems fitting that Dolores seeks to dethrone humans as the world’s primary guests. It is here that the robots and the real world share a mutual fate: both have been mistaken for mere hosts designed entirely for the human species.
For Deleuze, “It’s not a question of being this or that sort of human, but of becoming inhuman, of a universal animal becoming—not seeing yourself as some dumb animal, but unravelling your body’s organization, exploring this or that zone of bodily intensity”.[12] Shows like Westworld promote the notion that to be truly human, one must transcend one’s humanity by becoming inhuman, or nonhuman, rather than superhuman. This makes the robots more human than the humans themselves. A similar line of thinking is portrayed in the 2009 film District 9, in which a human exposed to an alien substance becomes more humane as he becomes more alien and less biologically human. Indeed, in the schism between humans and humanoids, it is likely that as robots become more humane, immortal humans will become ever more arrogant. As the British philosopher Peter Winch argues, we often perceive humans as “falling short of an important standard of human dignity and excellence”, which undermines the humanist philosophy that to be human is to be superior.[13]
Yet for people like Elon Musk and the late Stephen Hawking, nonhuman beings such as extra-terrestrials and artificial intelligence signify an unnerving dystopian future in which humans become subservient to a new sovereign species whose values are automatically considered barbaric by a species self-consciously infatuated with war. Musk’s and Hawking’s understanding of posthumanism derives from the fear of nonhumans in a persistently humanist world, showing that they have merely retreated into the hierarchical comforts of humanism. Hawking, in particular, was so concerned with human survival (advocating interplanetary colonisation while cautioning against aliens and robots) that he never stopped to ask whether the human species had even earned the right to survive into the next, posthuman stage. His narcissism, and ours, is so complete that we always start with how and never ask why.
– Macquarie University, Sydney, July 2019
[1] Gilles Deleuze and Félix Guattari, What is Philosophy? (London: Verso Books, 1994), 108.
[2] Martha Nussbaum, “All About Us”, New Philosopher 23 (2018), 46-47.
[3] Thomas Nagel, “What Is It Like to Be a Bat?”, The Philosophical Review LXXXIII, no. 4 (October 1974).
[4] Arthur Kroker, Exits to the Posthuman Future (Hoboken: Wiley, 2014), 34-35.
[5] Cary Wolfe, What is Posthumanism? (Minneapolis: University of Minnesota Press, 2009), xiii.
[6] Günther Anders, “On Promethean Shame”, trans. by C. Müller, in Christopher John Müller, Prometheanism: Technology, Digital Culture and Human Obsolescence (London: Rowman and Littlefield, 2016).
[7] Anders, “Promethean Shame”, 39.
[8] Anders, “Promethean Shame”, 36.
[9] Anders, “Promethean Shame”, 36.
[10] Anders, “Promethean Shame”, 37.
[11] Friedrich Nietzsche, Thus Spoke Zarathustra (London: Penguin, 2003).
[12] Gilles Deleuze, Negotiations, 1972-1990 (New York: Columbia University Press, 1995), 11.
[13] Peter Winch, Simone Weil: “The Just Balance” (Cambridge: Cambridge University Press, 1989), 148.