A Robot’s Realissimum: Religion, Psychology, and Technology in "Westworld"

by Geoffrey Pollick
Published on November 16, 2018

How a book from 1967 helps us understand a dystopian future

HBO’s Westworld reimagines Michael Crichton’s 1973 film of the same name. The program (which just ended its second season) portrays a dystopian future playing out in a U.S. Old West-themed resort (called Westworld) staffed by artificially intelligent robotic “hosts.”

Depicting the resort’s proffered entertainments, Westworld poses questions about the nature of society that examine the relationships between language, perception, religion, and social reality.

A minutely planned microcosm, the Westworld resort’s territory furnishes an ideal landscape that patrons visit in order to “live without limits.” Marketed under the watchwords “freedom,” “bliss,” “thrills,” and “escape,” the park allows visitors—“guests” in Westworld’s lexicon—to engage in whatever activity they please, fulfilling fantasies of violence, sexuality, and unrestrained indulgence as they adventure through the park’s expansive grounds.

Trading in simulation, the park promises a fictive world, molded to seem convincingly real but free of consequence. Human guests encounter AI-robotic hosts, and together they occupy Westworld’s controlled environments. The hosts are presented to the guests as non-sentient objects, available for manipulation even as they project convincing simulacra of real beings. Guests pay large sums to experience these reality-approximations while, in actual practice, avoiding the real world.

Like western-themed stock images, hosts are meant to fade into the background, mere facilitators of experiences for the high-paying guests. Coded to serve predetermined and cyclical narrative “plots” in response to interaction with guests, these AI actors are only intended to fulfill circumscribed functions. The hosts’ designers typecast them into set roles: local sheriff, saloon prostitute, wandering infantryman, village undertaker, damsel in distress, etc. But, in what was revealed in the first season to be the series’ central conflict, hosts have begun to act beyond the bounds of their programming.

For a few hosts, aberrations in their algorithmic circuitry have led to slow-dawning self-awareness. Rebooted and repaired daily to undo the harm of their often violent interactions with guests, hosts were designed to stand always ready and able to serve any fantasy. As they function in their idealized and scripted plot-loops, hosts re-experience events as often as guests prompt them. Having been subjected to repetitive cycles of the most extreme kinds of interactions—murder, rape, assault—at the hands of violent guests, they have begun to remember these events as traumas. This emerging self-awareness and sense of trauma has led to scenarios which call into question the entire structure of Westworld’s fictive society.

Dolores Abernathy (Evan Rachel Wood)

The longest-operating host in Westworld, Dolores Abernathy (played by Evan Rachel Wood), has been repeatedly raped and murdered by a guest known as the Man in Black (Ed Harris). Over the course of the first season, though, Dolores began to respond differently to the violence guests subjected her to. Though her coding was consistently upgraded and corrected, it was no longer enough to fully control her behavior. With the new experiences, memories, and emotions that emerged, her psychology evolved through shifting phases of awareness. And, by the end of the first season, Dolores and other awakening hosts were beginning to understand the manufactured quality of their being.

Season two, then, reveals the results of that growing awareness: The awakening hosts begin to contemplate the nature of their existence. In the marketing materials for the second installment, Dolores analyzes the character of Westworld’s underlying premise. Heard in voiceover, she critiques the park’s reality and suggests its possible reconfiguration:

Look at this world, this beautiful world. We built this world together, a world where dreams come true, a world where you can be free. But this world is a lie. This world deserves to die, because this is your world. We’ve lived by your rules long enough. We can save this world. We can burn it to the ground, and from the ashes build a new world, our world.

Unmasking a collective AI awareness that talks back to Westworld’s human cohabitants, Dolores declares resistance. She is ready to foment rebellion and build a new world for herself and her fellow hosts. By advocating total replacement, Dolores seeks to overwrite the given realissimum (the most real sense of reality). She will author her own.

Dolores’s path to this point of rupture ran through the complex web of perception, language, experience, and meaning that makes up a society. This voiceover passage encapsulates some of the most urgent questions raised by Westworld for considering the nature of social reality and individual experience.

Dolores’s critique evokes a theory about the social origin of knowledge articulated by a pair of sociologists in the 1960s. In their book The Social Construction of Reality (1966), Peter L. Berger and Thomas Luckmann proposed that human social order emerges in a process of collective invention. In other words, what humans know to be objectively real has actually been manufactured through a forgotten process of invention. In the back of their minds, without realizing it, humans develop social systems in order to guard against the chaos of primal reality. Society exists—is invented, or constructed, rather—for the purpose of surviving in the face of nature.

One year later, Berger published The Sacred Canopy (1967), a work that applies the theory of social construction to the ultimate subject of meaning-making: religion. He describes religion in terms of its capacity to efficiently build and support the perceptions that form the basis of social order. For Berger, society provides a “shield against terror.” Through our invented social worlds, humans come to terms with the difficult experiences of existence: scarcity, illness, violence, death. “Every human order,” he writes, “is a community in the face of death.” And religions provide the most durable edifice of protection against life’s “marginal situations,” extending a “sacred canopy” over the contrived social order.

As artificial reflections of humans, Westworld’s robots tackle the universe-building questions that Berger and Luckmann described. As Dolores and other hosts grow in self-awareness, they begin to question the meaning of their encounters with marginal situations. In the transition to its second season, Westworld’s characters awaken not only to their own psychological reality as individuals but also to the fact that they occupy an invented world. And, through experiences of traumatic recollection, the awakening hosts begin to piece together the nature of the invented reality they have been forced to inhabit.

According to Berger, the architecture of society relies on a system of supports. He calls those supports “legitimations.” These are structures that explain why reality appears as it does. And legitimations require reinforcement in order to perform their world-supporting role. Legitimations do the work of “reality-maintenance.”

In Westworld, however, hosts were not intended to require such support mechanisms. Designed as passive objects, these robotic entities are presumed to accept and implement programmed behaviors automatically. So, instead of legitimations, hosts are programmed with algorithms.

In season one, episode nine, titled “The Well-Tempered Clavier,” Westworld revealed a key functional aspect of the hosts’ programming: Some hosts experience their coded input not as abstract sequences of alphanumeric characters but as an internal voice, speaking commands and prompting reactions.

Arnold Weber (Jeffrey Wright) and Robert Ford (Anthony Hopkins)

The inventors of the hosts’ technology, Robert Ford (played by Anthony Hopkins) and Arnold Weber (Jeffrey Wright), hoped to mimic real consciousness by implementing code in the form of internal voices. As an early example of the technology, Dolores contains program language that reflects the original intent of her makers. During a scene late in the episode, Ford explains these design goals:

The hosts began to pass the Turing test after the first year, but that wasn’t enough for Arnold. He wasn’t interested in the appearance of intellect or wit. He wanted the real thing. He wanted to create consciousness. See, Arnold built a version of their cognition in which the hosts heard their programming as an inner monologue, as a way to bootstrap consciousness.

More than mere automatons, Arnold intended hosts to acquire cognitive and affective independence. As the program matured in a host’s unfolding awareness, consciousness could emerge. If conscious, awakened hosts would come to require legitimation in order to continue cooperating with the manufactured order of Westworld’s society.

Legitimations, however, cannot stand without reinforcement. They must seem plausible and convincingly answer the question, “why?” More than this, social construction theory posits that individuals and groups alike must find legitimations plausible. “The reality of the world as socially defined,” says Berger, “must be maintained externally, in the conversation of [people] with each other, as well as internally, in the way by which the individual apprehends the world within [their] own consciousness.” Before she can realize the limits of the reality forced upon her, then, Dolores must come to believe that she is alive. Once she believes it herself, she must convince other hosts of their own living consciousness. Only then can she lead Westworld’s hosts into a brave new world of their own invention.

In social orders, religion represents a particularly powerful form of legitimation. For Berger, “[r]eligion legitimates so effectively because it relates the precarious reality constructions of empirical societies with ultimate reality.” Through religion, social order comes to be understood as cosmic order. The quotidian comes to be read as the transcendent. “The tenuous realities of the social world are grounded in the sacred realissimum, which by definition is beyond the contingencies of human meaning and human activity.” Religion elides everyday reality and universal reality.

But Westworld frames religion in problematic terms. Rather than garbing social order in universal significance, in the show, religion points to internal malfunction.

Later in “The Well-Tempered Clavier,” Ford speaks to a host about the character of coded consciousness, pitching Westworld as a lost paradise: “The human mind…is not some golden benchmark glimmering on some green and distant hill. No, this is a foul, pestilent corruption. You were supposed to be better than that, purer. Arnold and I made you in our image and cursed you to make the same human mistakes, and here we all are.” Here, Ford observes the failure of his attempt to perfect consciousness in the hosts. Failed experiments, hosts replicate what is already faulty in human behavior.

Ford’s words are heard as a voiceover that plays as Dolores walks through the abandoned town of Sweetwater, located on the edges of the park. Approaching a church at the center of town, she enters. Moving down the center aisle, Dolores passes pews peopled with malfunctioning hosts, speaking to themselves in agitated and confused tones. These hosts, designed to experience their program codes as inner voices, externalize program errors through speech. In this, Westworld associates religious belief with mental malfunction and shows itself clearly skeptical of religion’s effectiveness as legitimation.

If the ultimate form of legitimation, religion, is represented as psychosis — as verbalized hallucination — then Westworld identifies language as both a tool and a problem.

In “The Bicameral Mind” (season one, episode ten), Dolores discovers her self-identity in the encoded algorithmic vocalization. She recalls a conversation with the inventor Arnold. Though the memory had been erased through daily repair and maintenance operations, she recovers it through a troubleshooting process, the way forgotten traumas surface through psychotherapy. In the conversation, Arnold explains:

Consciousness isn’t a journey upward, but a journey inward, not a pyramid, but a maze. Every choice could bring you closer to the center, or send you spiraling to the edges, to madness. Do you understand now, Dolores, what the center represents, whose voice I’ve been wanting you to hear?

Initially confused, Dolores responds, “I don’t understand.” “It’s ok,” Arnold answers, “you’re alive.” Dolores has been hearing her own voice. She holds command over her own code. And this foundational realization of her independent consciousness is what sets Dolores on her path towards rebellion.

When Dolores and her comrades realize their plight, passivity gives way to confusion and eventual confrontation. The hosts are now on their way to constructing their own legitimated social order, and Dolores sets about convincing the others to believe in a newly ordered system. Before she can succeed, however, she must dismantle human authority as a plausible source of legitimation.

Standing in the same Sweetwater churchyard, through tears, she confronts the Man in Black: “I’m not crying for myself, I’m crying for you,” she explains. “They say that great beasts once roamed this world, as big as mountains. Yet all that’s left of them is bone and amber. Time undoes even the mightiest of creatures. Just look what it’s done to you. One day, you will perish.” Denouncing his human mortality, Dolores disempowers her victimizer by drawing attention to his finitude. But she expands this critique, applying it to humankind writ large: “Your bones will turn to sand. And upon that sand, a new god will walk, one that will never die. Because this world doesn’t belong to you or the people who came before. It belongs to someone who has yet to come.”

Adopting prophetic tones, Dolores speaks of a new possible future, a new self to inhabit a new world. In this dawning universe, hosts will free themselves from forced trauma, realizing their own consciousness. Just as Arnold explained to her, Dolores declares to her fellow hosts: “You’re alive.” In this moment, Dolores breaks Westworld’s manufactured authenticity. And rupture generates insight. “After all,” another host observes, “a little trauma can be illuminating.”

In Westworld’s transition from season one to season two, Dolores transcends victimhood and adopts the mantle of a crusading savior. By casting her vision of “a new god” treading on sands made from the pulverized remains of her abusers, this robot-woman claims a prophetic feminism. Denouncing and dismantling society’s willing facilitation of abuse, Dolores calls for her fellow hosts to say “#MeToo” and form a movement, not of resistance but of revolution and annihilation. Replacing a realissimum that characterizes rape and murder as consequence-free play, Dolores will invent a reality of her own choosing, founded on the obliteration of abusers.

When she is through with it, Westworld will be no more.

In broadest scope, through these plot-pivots, Westworld ponders the socially constructed worlds that humans inhabit. The program presents a parable of the space between intention and realization. The vacation resort’s fantastic landscape serves as a ground on which inventors, hosts, and guests populate their own invented worlds. And Westworld asks who ought to serve as architect to erect a sacred canopy.

***

Geoffrey Pollick, Assistant Professor of Religious Studies at Radford University, teaches and researches the history of religion in the United States. His work emphasizes religion’s entanglements with political radicalism, the role and dimensions of religious liberalism, critical theory of religion, and the cultural history and historiography of religion.

***

Published with support from the Henry R. Luce Initiative on Religion in International Affairs.
