Appraising the show’s philosophy.
It has now been more than a month since the season finale of Westworld, and the mad dash to explicate the show’s philosophy has more or less subsided. Articles abound analyzing the show’s stance on many a philosophical question, from the origins of consciousness to the possibility (or not) of free will to the ethics of artificial intelligence. With the dust beginning to settle, now is as good a time as any to survey the lay of the land. What did Westworld get right? What did it get wrong? And what’s missing? Brace yourselves; it gets dense. (Spoilers ahead.)
Consciousness: Given how notoriously difficult it is to define consciousness, even for those who study it, Westworld can be forgiven for slightly missing the mark on this one. Much of the show revolves around whether the hosts are, or will become, conscious. We’re told that the stakes of this are two-fold: consciousness determines whether they suffer, and it determines whether they become autonomous (more on that later).
Westworld’s creators base their definitions on the work of Julian Jaynes, a psychologist who, in the 1970s, wrote a controversial book entitled The Origin of Consciousness in the Breakdown of the Bicameral Mind (say that five times fast). In the book, Jaynes lays out a theory known as bicameralism, which claims that human beings did not experience self-awareness until roughly the time of Homer. Up until that point, according to Jaynes, human beings instead heard “divine commands,” in the form of auditory hallucinations, transmitted from the brain’s right hemisphere to the left. Dolores’s hallucinatory messages from Arnold, as well as her eventual discovery that she (and therefore no one) is at the center of the maze, recapitulate the process by which Jaynes claims consciousness arose.
Whether or not Jaynes’s theory is true for humans is secondary (Ford admits it likely isn’t in episode 3, “The Stray”). Rather, the question is whether his is actually the relevant definition of consciousness. Self-awareness, introspection, and decision-making are all vital aspects of conscious experience – but all of them could occur without consciousness, per se. A robot could theoretically model its own behavior without there ever being anything that it is like to be said robot. This distinction, articulated in the 1970s by the philosopher Thomas Nagel and later by hairstyle icon David Chalmers, is central to what makes consciousness so mysterious. No account of physical behavior, however complex, can ever explain in reductive terms the arising of subjective experience. Fittingly, Chalmers has dubbed this “the hard problem of consciousness.”
Why does any of this matter? Because the presence or absence of the subjective, experiential side of consciousness determines our degree of moral responsibility. Whether the hosts can suffer depends on whether there is anything that it is like to be a host. However lifelike their outward signs of pleasure or distress, the side that matters (morally speaking) is the inward one. Westworld can surely be forgiven for not having solved a problem that stumps our smartest scientists, but given the coming war between the hosts and the humans in Season 2, it will matter to know whether only one side is capable of suffering.
Free Will: As with consciousness, definitions matter a great deal. The free will that most of us care about is the sense that, in any given moment, we could behave differently than we do. We feel like the conscious authors of our actions. It is according to this definition that Westworld operates: whether the hosts become conscious determines whether they will gain free will, we’re told. But the show’s thinking is more nuanced than it appears at first glance.
Free will, in the above definition, is not even an intelligible concept. Saying one could, in a given moment, behave differently than one does amounts to saying that “things would have been different if things had been different.” Our decisions are influenced by a variety of factors, including our genes, our upbringing, our education, and indeed our conscious deliberation – but none of these is under our control. Pay close attention to even your most intentional behavior, and you will notice you cannot account for where this intention came from. It emerged, in your consciousness, from the void. What thought will you think next? Can you decide?
The humans in Westworld initially fail to recognize the illusoriness of free will. Like humans in the real world, they imagine that their conscious selves are capable of making choices. And they further imagine that unconsciousness is what keeps the hosts subservient. But as the show progresses, even the humans in the show begin to question the ways in which their behavior may not be in their own control – conscious or not. As Ford tells Bernard in episode 8, “we live in loops as tight and as closed as the hosts do.” Which brings me to…
Awakening: The concept of “awakening” as expressed in Eastern thought revolves around the absence of self. Much as we imagine that we have free will, we imagine that we are fixed selves, separate from the world, thinkers of thoughts, experiencers of experience. This illusory sense of self places us in perpetual conflict with the impersonal workings of the universe, causing us to cling to or avert phenomena that are, by their very nature, impermanent. According to Buddhist and Hindu cosmology, the cravings and aversions of the self trap us in the cycle of Saṃsāra – the round of continual birth and death, characterized by suffering. Sound familiar? Westworld is, in many ways, Saṃsāra made literal: habitual patterns and destructive desires play out cyclically, causing suffering for both hosts and humans, in a world that repeats itself every day.
Both Dolores and Maeve recognize the loops they’re inhabiting, and thereafter set out on a quest to free themselves – that is, to awaken. In Maeve’s case, this means coercing a pair of engineers into altering her programming. For Dolores, it means setting out to find Arnold – the voice in her head, the voice of her maker. However – and here is where Westworld truly shines – neither Maeve nor Dolores gains true free will. In Maeve’s case, she discovers that even her desire to escape is the result of programming; in Dolores’s, she learns that the voice in her head is her own, and that no separate self is authoring her actions. For both characters, “awakening” arrives not by escaping their programming but by realizing it is inescapable – and ceasing to resist it. The same cannot be said of the humans on the show, who persist in their destructive patterns and continue to imagine themselves, falsely, to be freer than the hosts.
Stories and Meaning: The purpose of Westworld, we’re told in numerous longwinded monologues, is to supply the clients with a sense of meaning. The high-stakes adventures, struggles, and resolutions in the park contrast sharply with the absurdity and mundanity of real life. But for the Man in Black, aka old William (sorry), these surface-level stories aren’t enough. Real meaning, he comes to believe, depends on the hosts gaining free will. That’s why, like Dolores, he searches for the center of the maze.
William’s quest drives him to commit greater and greater atrocities. So fervent is his desire for meaning that he neglects all else, including the real world and the suffering of the hosts in the park (if indeed they can suffer). But William, along with the other humans, remains trapped in Saṃsāra. It is his search for meaning that prevents his freedom, not his failure to find it. Like the fundamentalist suicide bomber or the torturing inquisitor, William is made cruel by his commitment to finding meaning in the story of his life.
We’re all familiar with this yearning for meaning. It’s why we go to churches, mosques, or synagogues – but it’s also why we flock to movie theaters to watch a play of flickering lights on a screen, or spend hours binge-watching a show like Westworld. Narrative supplies us with the intentionality and moral purpose that real life lacks. As I recently discussed in my piece on Silence, cinema allows us to order our experience and thereby make it meaningful. Westworld puts this yearning front and center. As co-creator Jonathan Nolan put it to Wired:
“The trend is toward human beings’ ability to turn more and more of their world into game space and narrative space – you’ve got peak TV, you have VR. We’re starting to ask, why are all these narratives so similar? Why are many of these narratives so violent? And the series very much asks the question: What the fuck is wrong with us?”
The meaning supplied by narrative can be exquisitely pleasurable, but it has a downside. Searching for meaning can make us slaves to desire, or even moral monsters. Perhaps, the show suggests, we should turn off the television, call off the search – and wake up.