The hosts are not real


“We speak the right words, and create life—out of chaos,” Ford says. “You can’t play without being acquainted with the devil.”

The human mind is complicated; it’s filled with cells, nerves, connections, and concepts less tangible: the memories and stories we tell ourselves over and over. Those layered parts make up a mind. When a robot is made in our image, programmed to have similar characteristics, there’s always a final piece: consciousness, that sense of awareness that is either the result of those layered parts or the first piece in constructing a sentient being, one with the ability to distinguish between levels of morality on its own. Philosophers throughout history have struggled to define the nature of consciousness. If consciousness is indeed a mechanistic output, then it’s possible AI can mimic that output, even in a basic form.

The engineers at Westworld essentially built computers in the form of humans, from scratch, coding their individual narratives. At the helm is the pragmatic Ford, the AI creator at Westworld, a Western theme park. He understands that controlling the AI (the hosts) while also giving them a natural quality carries some risk, but he believes he can control them. When he explains his reasoning for applying reveries to the hosts, he tells Bernard, “You’re a product of a mistake…evolution creates mistakes.” The creator insists that guests come back because of the subtleties of these characters and their interesting storylines. Guests want to be the first to discover those subtleties, and possibly fall in love with them. Guests may know who they are, but they go to Westworld to glimpse who they could be in an alternative world filled with danger. Some guests seem content staying idle and picking easy characters to interact with; others, like the Man in Black, need to find purpose and control in a game that is meant to trick new visitors.

Though we keep hearing that these artificial creatures are not real, they are too lifelike for us not to become enchanted, as the guests do, and we end up rooting for them to escape. What’s scary is how often science fiction predicts future technologies; realistically, we’re not far from creating this type of AI. But the question gnawing at robotics engineers has always been how to create artificial consciousness. Westworld goes further, suggesting that if true artificial consciousness were possible, it would be cause for alarm, especially in a world where the hosts are not supposed to be free but subservient to the guests (the real humans).

Dolores, one of the oldest hosts in Westworld.


We see the hosts through the eyes of the guests, but also of the lab workers who interview them and update them on touchscreen pads. The hosts and lab workers don’t run parallel lives, since the backstories of the hosts can be altered during review sessions, making it possible to erase their memories. The hosts contemplate mortality, but their deaths are not final; they simply get fixed after being “killed.” The only real death is decommissioning.

Westworld dives not only into the realm of humans building AI but into the ethical questions it implies, especially when the hosts go rogue, disobeying their narrative boundaries. Westworld is a made-up world, yet everything seems so real. What does it mean to be human? Do AI have rights? Do they have free will? Do the guests have free will? They too are controlled by the boundaries of the park. Finding control within the game is an illusion, because the real decisions come from the creators, the masters of code.

Hegel proposed that the timeline of history shows a movement towards freedom, true freedom: individuals have come to understand and master the worlds of illusion we create for ourselves and others. He believed that humans created God, then pretended God created us, so that someone could be responsible for us. This may not be a mistake exactly, but it demonstrates that we are the creators of our own narrative, no matter how deluded it turns out. We turned the idea of God into a convenient story and filled it with commandments and rules that we sometimes don’t understand or obey. We are the slaves of our own deluded fiction, and Ford is no different. He says, “Self-delusion is a gift of natural selection as well.”

In one of the earlier episodes we learn about the original creator, Arnold, who had devised a pyramid to explain the AI’s internal workings: memory, improvisation, self-interest, with the top left blank. Ford explains, “He had a theory, based on the bicameral mind. The idea that primitive man believed the voice he heard was the word of God. It was the blueprint for building a cognitive mind. Their own voice would later take over.”

Ford values the art of creating interesting storylines and indulges in watching the AI inhabit a nuanced narrative. This pattern of behavior foreshadows the final moments of the first season. In the last episode, Westworld seems to be driving the point home: the real god is the human mind. If the second season carries the theme of chaos, it seems likely we will witness most of the hosts making independent decisions apart from their creators, a true slip of evolution’s leash.




Ex Machina’s sensitive robot: More human than us?

Alicia Vikander plays the female robot, Ava.

Ex Machina’s femme robot, Ava, is captivating and thoughtful. Could it be that she’s more human than we are?

At the outset of the film we meet Caleb, a young coder, somewhat naive though witty. He uses the language of a seasoned programmer to form elaborate conclusions yet ignores basic human instinct. After winning a competition, he is flown to a secluded research facility in the middle of a forest. His sole job: to administer the Turing Test to Ava, a functional humanoid with artificial intelligence.

The creator of this humanoid is Nathan, a straightforward CEO-genius-programmer with an appetite for alcohol and secrecy. He’s not interested in the technicalities of the test like his guest Caleb; instead, he tries to uncover human frailties. Nathan has carefully orchestrated a plan for Caleb, but on the surface calls himself “a friend.” The interactions between Caleb and Nathan are amusing to watch, since one is without malice, functioning structurally, while the other functions instinctively, like a snake.

Upon meeting Ava, Caleb, and I imagine everyone else in the audience, is pleasantly captivated. Ava has the ethereal face of a young girl, with skin on her hands and feet, but the rest of her body is that of a cyborg. She moves delicately, with soft agility and purpose. As she turns or walks, her interior wires make quiet noises. The fluidity of her words is alluring, each one chosen carefully for the viewer. I get the feeling she’s just toying with us all.


The concept of Alex Garland’s film is thrilling. Watching a human investigate a robot, and vice versa, is transcendent, as are the questions that ensue between Caleb and Nathan. Does Ava know she’s playing chess, or is she just following the rules? Does she have feelings? Do you need human interaction to determine consciousness?

It forces us to question our own understanding of what it means to be human: a kind of study of our own psyche within the walls of a futuristic research facility controlled by buttons and a central computer. It’s one of those eerie, claustrophobic places where you can’t freely move from room to room.

Nathan showing Caleb “the brain” of a robot.

Garland’s film is not without misses. Formulaic dialogue runs rampant in the beginning, and some reviews claim the whole film falls into a predictable laundry bin; I wouldn’t go that far. The tale of a single man meeting a sexy femme bot is a gimmick at face value, and problematic for an accurate Turing Test, since the tester has a clear bias. The dynamic between Caleb and Ava is tricky, often running into irksome territory. There are other scenes in this laundry bin, but I leave those for you to decipher (since I don’t want to spoil all the fun).

As the movie progresses it becomes suspenseful, gaining speed once Nathan reveals the outer edges of his mind, as well as Ava’s. The film overall is studied, analytical, and at times profound. I wonder if this movie isn’t so much a test for the robot as for the humans performing the test. We are walking around with consciousness, after all, and it’s our responsibility to curb our unruly emotions and be objective.

Walking out of the theater after watching Ex Machina, I had the kind of thoughts you get when looking into a mirror. Who am I? A robot with fake skin walking around, or a real person with feelings and morals?