By Vivian Kane | TV | October 3, 2016 |
This post includes spoilers and plot discussion for episode one of Westworld. Turn back now, ye who haven’t yet watched.
With all respect to our Prime Overlord, whose opinion differed, I thought this weekend’s long-awaited Westworld premiere was every bit as fantastic as we’d been hoping it would be. I’m very much looking forward to seeing where the season (and the next few already-planned seasons) takes us, since the pilot felt (never having seen the original 1973 film, myself) more like the first half of a movie than the start of a television series.
While the pilot’s plot seems like it should be wrapped up in about as much time as the Simpsons’ trip to Itchy & Scratchy Land (an episode my younger self never realized drew on this same Crichton source material), the real essence of the show is ideas, not actions. Yes, we want to know who is and isn’t a host (James Marsden, you had us going for a while there!), but if this show is to last a season, let alone five, it seems likely that the real focus will be on hosts (a fancy word for robots) discovering themselves, and the world.
The pilot saw a number of hosts breaking free of their programmed “loops,” shutting down or acting out. But Peter Abernathy, Dolores’s father, manifested his glitch in the form of questions. In addition to suddenly recalling a slew of Shakespeare quotes (The Tempest, a few from King Lear, Henry IV, and Romeo and Juliet, by my count), he tells his daughter he’s been asking himself “the question you’re not supposed to ask.”
So what is that question?
Presumably, it’s the first question in Bernard Lowe’s (Jeffrey Wright) opening-scene test, “Have you ever questioned the nature of your reality?”, or some variation on that theme. In other words, the hosts are starting to wonder if they’re hosts. They are NOT supposed to ask that question.
Because even beyond what that question means for the hosts (they’ll start to act outside their programming, disrupt the gameplay, develop emotions, and struggle with a sense of self), it forces us, as proxies for Westworld’s visitors, to ask the question that the park’s fictional operators DEFINITELY don’t want asked:
Is Westworld ethical?
This is the question every AI story has to ask, at least in film and television. Because, logically, if the hosts of Westworld aren’t asking these questions, the rape and murder the park is built on is no worse than paying a disgusting amount of money to blow up an empty building, or even to smash a mirror or tear a page out of a book. The hosts look like people and talk like people (and, in this show’s case, fuck like people), but they are entirely inanimate. They can even pull off those “minor improvisations” outside their designated scripts and still not be anything close to human.
There is, for sure, a moral gray area, but it concerns the visitors. Just because they’re not raping and murdering humans does not mean it’s ethical to set up a zone where wealthy psychopaths can indulge their Dexter instincts by performing these acts in a condoned yet entirely realistic way. It’s undeniably disgusting to want to feel like you are raping or murdering a human, but that questionable morality falls on the perpetrators. The ones on the receiving end are basically computers.
Until they start to ask questions, that is. Because while asking if they’re human doesn’t make them human, it does make them something more than machine. In the Turing test (still the most famous gauge of artificial intelligence), the onus of “passing” falls on the human judge: a machine passes if the person conversing with it can’t tell whether it’s a machine or a person. In Westworld, the humans aren’t asking these questions. Or at least, the visitors aren’t. It remains to be seen whether the park’s creators will.
For now, these questions are left to us, the audience. I’m not sure where the show is headed from here, but Westworld doesn’t seem like the type to give us answers before it lays out a whole array of new questions.