10 Things Westworld Gets Wrong about Coding

I thoroughly enjoyed watching HBO's Westworld. Now that Season 1 is over, it's fair game to share what's wrong with the plot, and particularly what it says about programmers.
SPOILER ALERT - if you haven't watched all 10 episodes, stop reading now.

So here are 10 things Westworld gets wrong:

  1. "We Don't Know How the Hosts Work" - the nonsense of this is obvious to anyone who knows how to code. In the fictional story, 40 years ago Arnold wrote half the code for the hosts ("the most elegant half"), and then died without explaining his work to anyone. If that were the case, the hosts would breakdown and be irreparable in a lot less than 40 years. In the real world, no one gets away with not understanding, documenting and explaining their code to others.
  2. Hosts can choose to ignore their programming - the writers have a romantic notion that the hosts can "become conscious" and then choose to ignore their programming. In the real world, computers execute programs without "choice". Complex systems can sometimes create the illusion of choice, but there's nothing and no one inside a program to do the "choosing" (see the first sketch after this list).
  3. "We can't define consciousness because consciousness does not exist." That's one school of thought, but it's a point of view that defies everyday experience. It's not "right," it's a current hot topic of debate in scientific and philosophical discourse.
  4. "We make the hosts hear their programming as an inner dialogue and make them suffer to bootstrap consciousness." This is self-contradictory anthropomorphic nonsense. If there is a "you" to hear "your programming" as an inner dialogue, you're already conscious. If you can suffer, you're already conscious. Neither the hosts nor we can prove that they're anything more than intelligent-seeming zombies.
  5. More self-contradiction: In Episode 6, we learn that Maeve has "bulk apperception" set to 14 (the maximum allowed for hosts) because she's in a management position. We further learn that the dial goes higher because hosts "have more processing power than humans." The term "apperception" originates with Leibniz's critique of Descartes and was further developed by Kant; it refers to introspective self-consciousness. So why do the hosts have to suffer again? Apparently, they have consciousness to begin with, and we can just dial it up at will.
  6. More processing power = more consciousness. This is an equation that Kurzweil and others confidently assert, even though there is no evidence, much less proof, for it. In other words, this may be the prejudice of an intellectual elite, but that doesn't make it true.
  7. The hosts have far more processing power than humans. Kurzweil and others also think we're really close to the time when computers exceed human intelligence (which they equate with processing power). They'd predict that 50 years from now, computerized "hosts" could easily have more processing power than an individual human, or even than the combined power of all human brains together. I'm not so sure you can model a brain with the computers we have now, at any level of complexity or power. Some, like Roger Penrose, argue that human thought involves quantum processes, and our quantum computers are still very primitive. It may be hundreds of years before we need to worry about any such thing.
  8. Maeve / Felix can modify her code with no training? Even with a 20+ bulk apperception, I have trouble believing that Maeve and/or Felix could alter her programming without any training. Really smart humans take months to learn coding and years to master it. I've got to believe the hosts' code is pretty complex. It should take years to develop the necessary skills, so Maeve, whose total life experience has been in the 19th century, shouldn't have been able to do more than make a mess, even with Felix's help.
  9. Erasing a host's memory doesn't work! When you genuinely erase data (overwrite it, rather than just removing the file's index entry), it stays gone. In Westworld, Ford deletes a host's unpleasant memory, but it keeps coming back in flashbacks. That's not how computer memory works (see the second sketch after this list).
  10. Caveat for machine learning and holographic memory. Learning systems like artificial neural networks and memory systems like holographic memory have some of the traits of the hosts' minds. A neural network may perform a function without a clear procedural explanation of how that function is calculated (see the last sketch after this list). A holographic memory can't be easily erased, because every part holds a version of the whole. But neither of these facts squares with Westworld. If you know you have a holographic memory, you don't go around harboring the illusion that you can easily erase it. And "analysis" mode is useless for artificial neural networks; if an analysis mode works, the code must be procedural or rule-based.
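
Here's a minimal Python sketch of point 2 (everything in it is my own invention, not anything from the show): a "host" reduced to a function from state and stimulus to action. The branching looks like deliberation, but identical state plus identical input yields an identical action, every time.

```python
# A "host" as a pure function of its state and input. The if/else
# branches look like decisions, but they're as mechanical as arithmetic;
# there's no chooser hiding anywhere in here.

def host_step(state: dict, stimulus: str) -> str:
    """Return the host's next action, fully determined by its inputs."""
    if stimulus == "threat" and state["aggression"] > 0.5:
        return "fight"
    elif stimulus == "threat":
        return "flee"
    return "smile and improvise"

state = {"aggression": 0.7}

# Same state + same stimulus -> same "decision", as often as you run it.
print(host_step(state, "threat"))  # fight
print(host_step(state, "threat"))  # fight -- no second thoughts possible
```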
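
And a sketch of point 9 (again, the names are hypothetical): data that's actually overwritten leaves nothing behind to flash back. The one subtlety worth knowing is that ordinary file deletion only unlinks an index entry, which is why undelete tools exist at all; genuine erasure overwrites the bytes.

```python
# A host's "memory" as a mutable buffer.
memory = bytearray(b"unpleasant incident at the Mariposa")

# Naive deletion would just drop the reference; on a real disk the raw
# bytes can linger until the space is reused (hence undelete tools).
# Genuine erasure overwrites the bytes in place first:
for i in range(len(memory)):
    memory[i] = 0

print(memory)  # all zeros -- nothing left to resurface as a flashback
```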
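
Finally, a toy illustration of point 10 (a classroom example, certainly not Westworld's technology): train a tiny neural network to compute XOR, then try to "analyze" it afterwards. What you get is a pile of weights, not a procedural rule.

```python
import numpy as np

# Train a tiny 2-4-1 network on XOR with plain gradient descent.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(20000):
    h = sigmoid(X @ W1 + b1)             # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)  # backprop of squared error
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(axis=0)

print(out.round(2).ravel())  # should be close to [0, 1, 1, 0]: it works...
print(W1)                    # ...but the "explanation" is just numbers
```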

Westworld is great storytelling. It dramatizes existential and epistemological questions in the context of a gripping story. If you're inspired by the "creator" power of a Robert Ford character, you might consider getting involved in coding yourself. It's not as dramatic as Westworld, but it is awesome. And who knows? If I'm wrong and the robot apocalypse is coming soon, maybe you should know more about how our future overlords are made. Just kidding!