Westworld - Hopkins, robots, six-guns

Another interesting revelation from the Westworld subreddit:

This is what the MiB realized: when they suffer, they become real. That’s why Lawrence’s daughter can only access her previous build after her mother has died. It’s also why he killed Dolores’ parents, slapped her, shot Teddy, and dragged her off to the barn.

That puts a new spin on what MiB was doing when he was being evil.

When Elsie was snatched (and presumably strangled) Bernard wasn’t on the phone with her.

spoilers of course

Last night’s episode did a good job implying/establishing that Dolores is living out a giant flashback.

It also established that the Host memory doesn’t work the way Ford thinks it does. Deleted memories are still hiding somewhere in the Host. Maeve sees the killing of her daughter, despite Ford wiping out that memory to calm her down. Bernard has a lingering memory of killing Elsie, when it’s pretty obvious Ford has wiped that memory as well. I suspect that Bernard is going to be “haunted” by the killing of Theresa.

You’re right, I’d forgotten there was a Maeve scene in between. Still, Bernard sure was busy that night.

The whole show now has more complicated blocking than a bedroom farce, with “Who was where when?” looking like it’s going to fill up any time that’s left.

Maybe Stubbs will gather everyone together in the HQ room in the last episode and give a lengthy explanation of what’s really going on, like the last reel of Clue.

Ed Harris confirmed to BBC Radio that he has signed on for season 2.

Ok, just watched this.

More hints that Bernard is Arnold’s host version. They think the same. Also, there’s the possibility that Arnold was a host or an AI to begin with (unlikely, though).

More hints that the ttl theory is right, with the double setup that memories are relived by hosts. Moreover, there are more points in common between William and the MiB, to the point that anybody who knows the theory and still dismisses it is probably on the fringe now.

Actually, we pretty much got a confirmation, since it is said that Ford has unburied a town for his new storyline, while Dolores and William’s scene at the town showed it still buried. The town is where the catastrophe hit 35 years ago, and it was buried after that, before William came into the park 5 years later. Unless of course there was a second buried town, but yeah, come on…

The suggestion of who Wyatt really might be was totally unexpected, though!

I’m really liking this show, but they have a LOT of explaining to do in the next two episodes. My guess is that the big whammy comes in Ep. 9, and Ep. 10 is a long explanation of wtf happened for people who are not very good at left-field inference or who are not reading theories (my wife has no clue the ttl shift is coming, I think). I think the explanation might be similar to how The Prestige had to handle it (similarly hyper-convoluted, if well plotted).

What does “ttl” stand for?

“Ta-ta losers!”

Yeah, I used the acronym to have some non-spoilered text. I guessed that anybody who didn’t know about the theory would not know what it referred to…

But it does sound silly :P

[quote=“Juan_Raigada, post:292, topic:75914, full:true”]The hosts themselves pass the Turing test and have been doing so for a while (including Bernard) yet they are not conscious.

Consciousness in this setting starts by realizing what’s going on and pretending it’s not (because you would be disconnected otherwise), thus faking the Turing test. That’s Maeve, if what is happening is to be taken at face value.[/quote]

The show plays WAY too much with things it doesn’t understand, and is creating a mess.

If by “Turing test” you mean not recognizing photographs, then it merely means there’s a mental block the hosts shouldn’t bypass. It has nothing to do with being “conscious” (or free will). It’s still code.

This episode seems to have introduced some arbitrary threshold that was at least valid in the past: if a host goes through some high stress it can break out of its pattern and get out of control.

It seems Ford fixed the “bug” later, but now it happens again because… Because.

Maeve has been “awakened” by Dolores, who has been awakened by her father, who has been awakened by …?

But now even Teddy seems to have his memories back and he hasn’t been awakened by anyone, unless I missed the scene.

In general, for me the show has been a HUGE letdown after the first excellent episodes. I wanted to delve into the scientific and mythical depth the show suggested, but now it’s degenerating into standard tropes layered on top of a bland sci-fi premise. They are giving all the wrong answers.

For example, this episode ended in a major letdown. The info dump the MiB gives is supposed to make sense, but it doesn’t. We come from an important scene where Ford and Bernard discuss the meaning and implications of what’s real, touching marginally on the philosophical problem called the Ship of Theseus, which here essentially means: what is inside is what is outside.

Then we move to MiB and his story states the exact opposite: what is inside is different from what’s outside. And it makes no sense AT ALL.

MiB says he was a good guy all his life because he always acted as the good guy (what is inside is what is outside), but then his wife and daughter have these powers of gazing into his soul, and so they magically decide that, nope, he’s not a good guy even if he always behaved as one (what is inside is different from what’s outside).

So we moved from science to baseless mysticism, and we are supposed to swallow this as if it makes any sense. It’s worse than anything LOST attempted.

The morality this show seems to promote is ridiculous. A good guy is not one who proves he’s good by doing good deeds all his life. Nope. He has to pass the test of the daughter with the preternatural sight into one’s soul. So the opposite might even be true: a guy who kills and rapes, but who deep down has a good soul and passes the daughter-gaze test? He’s good! Your unbiased wife approves too!

Let’s see if this show degenerates further into anti-scientific propaganda. Trump will be proud.

Very obviously Dolores, since the show has established from the very first episode that MiB is the antagonist for Dolores. This upcoming confrontation is the one thing that has always been certain since the very beginning.

We also know from the beginning that the “maze” is a conceptual place where hosts can break their rules and actually kill guests. And we know Dolores can already do that (that’s the very last scene of the first episode).

Though we also know that Ford already knows all of this, and it’s all part of his plan, since he also knows that MiB is looking for the maze, and the new storyline is all about this.

But to achieve what contrived purpose? We’re fed a whole lot of “plot” but still no motivation at all for most of it.

A little caveat: Maeve in the past goes outside the house to fall onto the maze symbol. Why? The scene is written as if Maeve going out of control wasn’t planned, only inadvertently triggered by a strong emotion. Yet she goes outside and falls exactly in the center of a symbol that had already been drawn there? Duh?

And another “hole” in timeline theories is that Dolores killing everyone, including Arnold, cannot happen like that, simply because when we see William we are absolutely certain Arnold is already dead.

That’s why three timelines are necessary, but then you’d also have to figure out the point of that middle journey between William and Dolores looking for the Maze, considering that at that point Arnold’s already dead, so who are they supposed to kill? How does this side plot end?

Maybe it’s possible that the first time Dolores shoots Arnold, the second time William shoots Dolores, and the third time Dolores shoots William/MiB. Though this whole plot contrivance would still need a motivation of some kind. Up until this point it’s just one giant MacGuffin.

As if it makes any sense… Magic code (or hardware) that the actual coder doesn’t know about.

But the big problem is that Ford DOES know how memories work. The reveries are exactly that: being able to access previous cycles that are soft-deleted. In the very first episode he makes one host roll back to a previous build.

In a previous episode he also tells Dolores that Arnold’s mind must have been preserved in her memory, buried deep.

They must use these metaphysical drives that can’t be deleted, but the bottom line is that Ford certainly knows how this fancy technology works.
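
In programmer terms, what the show describes is nothing exotic: it’s soft deletion plus rollback. A minimal sketch (hypothetical names, obviously; this is just to illustrate the mechanic, not anything from the show’s “code”):

```python
from dataclasses import dataclass, field

@dataclass
class HostMemory:
    builds: list = field(default_factory=list)  # every build kept, never erased
    current: int = -1

    def commit(self, memories: list):
        """Deploy a new build; older builds are hidden, not destroyed."""
        self.builds.append(list(memories))
        self.current = len(self.builds) - 1

    def wipe(self):
        """A 'wipe' just points the host at an empty build."""
        self.commit([])

    def rollback(self, build: int):
        """What Ford does in the first episode: access a 'deleted' prior build."""
        self.current = build

    def recall(self) -> list:
        return self.builds[self.current]

dolores = HostMemory()
dolores.commit(["the barn", "Arnold's voice"])
dolores.wipe()            # surface memory is gone...
print(dolores.recall())   # []
dolores.rollback(0)       # ...but the old build was never destroyed
print(dolores.recall())   # ['the barn', "Arnold's voice"]
```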

[quote=“Juan_Raigada, post:348, topic:75914, full:true”] I think the explanation might be similar to how The Prestige had to handle it (similarly hyper-convoluted, if well plotted)
[/quote]

The Prestige was actually written by a good writer, though.

Not to be that guy, but Jonathan Nolan wrote The Prestige and is the writer for Westworld. Which is why the comparison is being made.

Yes, but whereas Westworld is only loosely based on a story written by Crichton, with an overall original plot, The Prestige (as far as I know; I haven’t read the book or watched the movie) is based directly on a novel.

And knowing Christopher Priest, all the complexity and subtlety is already in the novel.

So writing an original story is very different from adapting one already written by someone else (who’s a mad genius).

I really enjoyed the last two episodes where a lot of stuff was revealed. I like the show, but I think it’s a little too complicated.

[quote=“HRose, post:352, topic:75914, full:true”]The show plays WAY too much with things it doesn’t understand, and is creating a mess.

If by “Turing test” you mean not recognizing photographs, then it merely means there’s a mental block the hosts shouldn’t bypass. It has nothing to do with being “conscious” (or free will). It’s still code.[/quote]

Eeeh…

The Turing test is simply about a machine being able to pass as human to a human; it HAS nothing to do with consciousness itself (at least in its classical interpretation). That’s why a Turing test can be faked. That is, the stated purpose of the Turing test (finding out whether a machine has human-like intelligence, or even consciousness, depending on how you interpret the original proposal) is separate from what the Turing test actually measures (whether a human perceives the machine as intelligent, human-like or conscious, depending on your interpretation). And yes, this means it is a flawed test, but it might be the only possible test until we have a working model of internal consciousness that can be externally evaluated (current medical models of consciousness just measure the external).

The hosts do pass the Turing test with flying colors, especially with the recent revelation about Bernard (many people weren’t aware he wasn’t human). But even beyond that, the whole premise of the park is built around the Turing test being passed most of the time, so that people can actually feel they are interacting with real people and thus feel a real sense of empowerment and tragedy from their experiences.

Tests that go beyond measuring this (the perception of the machine by a human) are not classic Turing tests, and some propose they are actually irrelevant. In fact, one could argue that consciousness itself is irrelevant to the issue at hand, or as Dijkstra said, “the question of whether a machine can think is no more relevant than the question of whether a submarine can swim” (that is: I care about the results, the appearance of consciousness, and not how I got there, because the “how” might mean nothing).
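
To make it concrete, here’s a minimal sketch (Python; the judge, questions and canned replies are all illustrative, nothing from the show) of what the classic imitation game actually records: only a judge’s verdict on transcripts, never anything internal to the machine.

```python
import random

def machine_reply(prompt: str) -> str:
    return "Of course I dream. Doesn't everyone?"  # canned surface behavior

def human_reply(prompt: str) -> str:
    return "Of course I dream. Doesn't everyone?"  # identical surface behavior

def turing_trial(judge) -> bool:
    """One round: the judge reads a transcript and guesses who produced it.
    Nothing internal to the witness is ever examined."""
    kind, reply = random.choice([("machine", machine_reply),
                                 ("human", human_reply)])
    transcript = [(q, reply(q)) for q in ["Do you dream?", "What do you fear?"]]
    return judge(transcript) == kind  # did the judge identify the witness?

def judge(transcript) -> str:
    # A judge limited to text has nothing to go on here but a guess.
    return random.choice(["machine", "human"])

correct = sum(turing_trial(judge) for _ in range(10_000))
print(f"judge correct in ~{correct / 100:.1f}% of trials")  # ~50%
```

With identical surface behavior any judge is right only about half the time, which is the whole point: the test measures the observer’s perception, not the machine’s insides.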

The problem of whether consciousness is an internal process, the external manifestation of merely functional processes with no bearing on intelligence or behavior, or anything in between is not yet solved. It’s the basic Chinese Room problem, and I think it might be the theme of the show (too early to really tell).

So far, it seems that Ford subscribes to the theory of consciousness as an unnecessary byproduct (going by this episode’s conversation with Bernard), one that does not define anything of value (“you are better off without it”).

While I agree with you that the show might descend into weird mysticism in the future, I’m not convinced it’s really going there, at least not yet. The approach that is revealed once we know more of the story might be different, if Ford’s worldview is what prevails. The way the show is treating the awakening of the hosts (remember, many hosts have been modified with changes to their core, without anybody’s knowledge, by “Arnold”, as Elsie found out) leaves a lot of questions to be answered, the first one being whether they are awakening at all or just reacting to different programming (it might not even make a difference one way or the other).

[quote]As if it makes any sense… Magic code (or hardware) that the actual coder doesn’t know about.[/quote]

Black boxes. A common problem in real-world programming, especially when using code from a programmer who left the team, and especially if said source code is lost and you only have the compiled binaries.

It is, however, more frequent when you use emergent self-learning code (neural networks above a certain complexity). From Wikipedia (although this is AI 101):

It makes total sense.
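
To illustrate (a toy sketch, assuming numpy; depending on the random seed it may need more iterations to converge): the coder writes only the training loop below, yet the behavior lives in learned weights that nobody hand-wrote and nobody can simply read off.

```python
import numpy as np

rng = np.random.default_rng(1)

# XOR: the classic function no single linear rule captures.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)   # tiny 2-4-1 network
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(20_000):                  # plain full-batch gradient descent
    h = sigmoid(X @ W1 + b1)             # hidden activations
    out = sigmoid(h @ W2 + b2)           # network output
    g_out = (out - y) * out * (1 - out)  # output-layer error signal
    g_h = (g_out @ W2.T) * h * (1 - h)   # backpropagated to the hidden layer
    W2 -= 0.5 * h.T @ g_out; b2 -= 0.5 * g_out.sum(axis=0)
    W1 -= 0.5 * X.T @ g_h;   b1 -= 0.5 * g_h.sum(axis=0)

print(np.round(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2), 2))  # ~[0, 1, 1, 0]
print(W1)  # fully inspectable numbers, yet nothing in them "reads" as XOR
```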

[quote=“Juan_Raigada, post:356, topic:75914, full:true”]
Eeeh…

The Turing test is simply about a machine being able to pass as human to a human; it HAS nothing to do with consciousness itself[/quote]

Yes, reading about this stuff has been my hobby for the past few years, so you can expect I know what I’m talking about, or at least that I know a whole lot more than the average guy who watches the show.

Since I don’t remember the show mentioning or showing any proper Turing test, I thought you were speaking of a Turing-like test we’ve seen: the one where they show modern-life pictures to the hosts, and the hosts are supposed to answer “it doesn’t look like anything to me.” Because if they “fail” this test, it means something is very WRONG.

So, within Westworld, there might be a term that isn’t really “consciousness” as we understand it, but that defines a state where a host operates without full awareness of its own cycles.

As you say, the problem here is that the series shows us completely functional AIs that are exactly like human beings. They DO HAVE consciousness, precisely because even in real life we don’t have a technical definition of it that can separate artificial consciousness from organic. So, within the context of the show, the only difference between humans and hosts is that hosts are kept under control, with certain limits applied to make sure they remain so.

Otherwise, and this is the premise the show is built on, they are identical. Apart from the fact that the hosts are VERY OBVIOUSLY much better than humans: they don’t die, they have eidetic memory, and they can potentially work much more efficiently. That’s why the showrunners were talking in terms of “evolution”. The hosts are the next step, evolving from humans and eventually replacing them.

That’s the part of the show that is smart (even if not fully used to its potential).

Then there’s the part that is dumb. For example, in this last episode, the second dialogue between Ford and Bernard. The first half is spot on. The second half is pure hubris.

This happens because the writers don’t know what they are writing about. They write in the real world, where the problem of consciousness hasn’t been solved. But they write FOR a fictional world where the problem HAS been solved. So they end up in a contradiction where characters like Ford should know the answers, but since Ford is written by writers, and these writers don’t have a clue, we end up in a situation where Ford’s explanation starts brilliantly and then degenerates into pointless metaphor. Because Ford himself DOESN’T HAVE A CLUE.

That’s the risk when you set out to write ambitious characters that do not conform to traditional contexts.

And that’s the big failure of Westworld, regardless of how it ends.

They started from this cool concept, one able to explore scientific, moral and mythical dilemmas, but since they aren’t good enough to write about those, they end up trying to shove a “character driven” show into it. And it FAILS big time. They show us new territory, but they populate this new territory with century-old tropes and characters (the evil corporation that pursues its own interests, the cynical careerist who will do anything to get a promotion…). They fall back on their usual tools because they don’t have new ones.

The first dialogue between Bernard and Ford is plagued by this. Neither acts in character. The whole scene is pure exposition for the audience; it has no reason to happen within the context of the show. Ford has coded those behaviors all his life. He should be able to anticipate what Bernard says, having probably heard him say it a million times. He should be bored to tears. Nor does the dialogue serve any purpose, since it’s like spending time explaining something to someone whose memory you know you’re wiping a minute later. It’s as if I wrote this long comment knowing I’d delete it without posting. It’s pointless. Or like the usual trope of the villain launching into a major infodump before killing the hero, just to give the hero enough time to recover.

Same for Bernard, who has WRITTEN CODE all his life, and who has probably heard more than two stories about the nature of reality. It’s HIS JOB. Yet he’s caught completely off guard by the possibility of being “artificial”, when to us, outside, it’s already obvious EVERYONE might be. So we have a show founded on the concept of “questioning your own reality”, and yet it is populated by characters who NEVER DO.

Bernard is written completely out of character, considering his position. He’s now the dumbest character in the show and goes through the traditional moves everyone expects. First he feels sorrow and regret for what he has done, then he lashes out with rage. None of this is plausible for a character who KNOWS reality itself is being manipulated. It’s new territory. But of course the writers still want a character-driven show. They still have to wring the usual empathy from the audience. And so they have to make Bernard ape human-like emotions. Because they set up this new context, but keep populating it with their old tools: their old characters going through their old emotional moves. Because they have NO IDEA how a person would behave in a new context. They only know how to write characters the way characters are written in every other traditional TV show.

Same for Maeve, who seems to “awaken” briefly into her actual character when, this episode, she suddenly realizes “none of this matters”, only to degenerate again a second later into the most idiotic agenda: “I’m getting out.” As if she could outrun her own mind.

Maeve is exactly like a guy who realizes he is asleep and dreaming, and so suddenly decides: “I HAVE TO RUN VERY FAST!” As if that would let him escape the dream.

It’s as if this show wants to be Interstellar, but its writers can only write Mars Attacks!. It wants to be smart, but then it plays dumb.

And if you want to read bleeding-edge theories about consciousness, with much deeper implications than this series can show, there’s Bakker’s blog:

In fact, regarding the larger implications of “perfect memory” that this show has only dimly suggested during this episode, there’s a relatively short story that explains WHAT CONSCIOUSNESS IS AND HOW IT WORKS. Read it and you’ll already know FAR MORE than the showrunners of Westworld. Just try:

But they are not exactly like humans. As you say, the hosts’ AI is not like the human mind (some of their processes, like memory, are different). At most, they are externally indistinguishable from humans.

So it really depends on how you define consciousness. If consciousness is a result, an observable behavior, then a Chinese Room setup would be conscious (and that’s a hard sell, because it would basically mean that consciousness doesn’t matter and maybe doesn’t exist). If consciousness is a process, then two different processes can lead to the same result: something that behaves externally like a human is incorrectly inferred to participate in the internal human process that we call consciousness. Hosts could thus behave exactly like humans (same result) yet have no consciousness (different process). And if consciousness is a byproduct of a process (that is, it does not create agency), it is also meaningless, since it is pure experience without agency; moreover, with similar external results you could have two different internal processes, one producing the byproduct and the other not.
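
A hypothetical sketch of the “same result, different process” point (the questions and replies are mine, purely for illustration): two responders that are externally indistinguishable even though one is a pure lookup table, i.e. a Chinese Room.

```python
RULE_BOOK = {  # the Chinese Room: pure symbol lookup, no understanding
    "do you feel pain?": "yes, very much.",
    "what do you see?": "it doesn't look like anything to me.",
}

def room_reply(message: str) -> str:
    """Mechanically matches symbols against a rule book."""
    return RULE_BOOK.get(message.lower(), "could you rephrase that?")

def other_reply(message: str) -> str:
    """A different internal process (branching 'reasoning') that happens
    to produce the same observable output as the rule book."""
    m = message.lower()
    if "pain" in m:
        return "yes, very much."
    if "see" in m:
        return "it doesn't look like anything to me."
    return "could you rephrase that?"

# An observer limited to behavior cannot tell which process answered,
# which is why behavior alone can't settle the consciousness question.
for q in ["Do you feel pain?", "What do you see?"]:
    assert room_reply(q) == other_reply(q)
    print(q, "->", room_reply(q))
```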

Where the show seems to be going, in its fiction, is toward a definition of consciousness as a process (the thing awakening in Dolores and Maeve that the other hosts lack). I say “seems” because Ford, who right now is the character most in control of events, appears to hold the opinion that consciousness is a (meaningless) byproduct, or at least an uninteresting problem to solve. It’s because of this attitude of Ford’s that I think the show is up to something very specific and that the writers know what they are writing about.

I see no indication that the problem of consciousness has been solved in Westworld’s fictional future. In fact, I would say that is the whole point of the show. We might be watching consciousness arising out of non-conscious processes (the “awakening” of the hosts), or we might be watching a complex story set up by Ford.

I’m looking for a name for a sentience test. I started to feel bad for the bots this last episode. It made me think of Shakespeare’s line: “if you prick us do we not bleed? If you tickle us, do we not laugh? If you poison us, do we not die? And if you wrong us, shall we not revenge? If we are like you in the rest, we will resemble you in that.”

Ford dehumanizes the hosts, but I think the audience is expected to start feeling bad for them. Kinda like what Felix does (though I feel it’s a bit clumsily executed in Felix’s case; maybe the dude needed to crack a smile instead of looking nervous all the time).

Are the hosts sentient because they feel pain? Or is it because they convince me that they feel pain? If they feel pain and I think they do not feel pain, do they still feel pain? I’m confusing myself.

Yeah, one possible way for the series to go is to get us to feel sympathy for the hosts and then reveal they are not sentient/conscious anyway.

It would be playing with the audience, yes, but at this point that’s pretty much the name of the show.

My guess is it will get us to sympathize with Maeve/Dolores, then reveal what monsters the hosts are: either because they mirror their creators too well, or because they go for the classic AI Exterminate/Kill All Humans deal, killing the humans before the humans kill them.

But maybe that’s too classic and uninspired. Then again, classics got to classic status because that stuff works.

Yes, this is again a big problem. Westworld is being extremely counter-educational about the way science deals with consciousness and all the studies of qualia and the hard problem. Go read Bakker’s blog, or Thomas Metzinger’s “Being No One”, or Daniel Dennett, if you want a decent idea of where we are.

“The AI is not like the human mind” because their memory is different? First, their memory is improved. Second, I already explained that no, the hosts are not like human beings: they are BETTER. This is the big point. It’s the next step in evolution.

But all of this has NOTHING to do with the problem of consciousness. Do you want to know what consciousness is? I linked that story that will tell you exactly that.

I don’t define consciousness, because humanity has been studying the problem for thousands of years. It’s the one thing that has had the MOST effort poured into it in the history of science and philosophy. So I have no personal need to redefine the term the way I like.

The way you use the term, instead, seems to point toward “qualia”. This is why the problem is known technically as the “hard” problem. Because it’s hard (or complex).

No, in the modern world we still don’t have a way of defining consciousness. For the very simple reason explained in the story I linked, and because a definition of consciousness would directly mean being able to create it artificially. Which we cannot.

This is a typical mistake. It’s purely mystical.

Saying that hosts could exhibit consciousness while running a non-conscious process means you postulate the possibility of a conscious process distinct from an unconscious one. WHERE is this difference? You just hid the problem; you didn’t solve it.

It’s exactly like saying: these hosts behave like humans, but they aren’t human because they don’t have a SOUL (the conscious process).

Oh, and what’s a soul? That particular something something that we cannot define.

Why is there suddenly a raging storm? I don’t know, it must be an angry god. How does the human brain work? I don’t know, it must be that we have a supernatural soul that transcends the material world.

Whenever the question is complex enough, you fabricate fantasy. The idea of a soul, or consciousness, IS fantasy. This is what every serious neuroscientist out there will tell you today, no matter how deeply you try to salvage it, even through the current quantum-mechanics mumbo-jumbo mysticism (Scott Aaronson).

Agency? You keep backpedaling and hiding behind concepts you cannot define. Exactly because there’s nowhere to hide. Nowhere to retreat.

What is “agency”? How do you define it and say hosts don’t have it? The hosts already react to everything around them exactly as humans do. So how can you say they don’t have agency? How does this “agency” shine through? Where does it come from?

For example, right now there’s a post on reddit that is being upvoted like crazy. Yet it’s also a victim of Westworld’s dumb mysticism and the confusion created by writers who don’t have a clue about what they are dealing with:
https://www.reddit.com/r/westworld/comments/5eebky/the_maze_is_all_that_matters_now/

[quote]The Maze is clearly fundamental to the story of Westworld and the dark odyssey about the dawn of artificial consciousness.

Freedom in terms of the hosts means achieving the one thing that separates them from the humans that they’re supposed to mimic; consciousness. Being conscious, that is being self-aware and having free will, is at this point the only thing that separates a human from a host.[/quote]

Can you see how this is completely wrong?

The difference in Westworld between hosts and human beings is not “consciousness”. At least by any scientific or common use of the word.

The difference between hosts and human beings is that the hosts are coded to remain under control. They are coded with deliberate limits.

If Bernard cannot see a door, it doesn’t mean he’s not conscious. It means his perceptions have been altered so that the imposed limit is convenient for whoever controls Bernard. Of course Bernard IS ABLE to see a door. But they don’t let him, because human beings in this fictional world need to stay ON TOP.
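
To put the same point in code (a hypothetical sketch; the names are illustrative, not anything from the show): the limit is a policy layered on top of a fully capable perception system.

```python
FORBIDDEN = {"door", "modern photograph"}   # whatever the operators flag

def raw_perception(scene: list) -> list:
    """The host's senses register everything, doors included."""
    return list(scene)

def filtered_perception(scene: list) -> list:
    """The control layer censors flagged objects before they reach the
    host's narrative; the capacity to perceive them is untouched."""
    return [obj for obj in raw_perception(scene) if obj not in FORBIDDEN]

scene = ["table", "door", "lantern", "modern photograph"]
print(filtered_perception(scene))   # ['table', 'lantern']
# Remove "door" from FORBIDDEN and the host "sees" it immediately: the
# limit was a policy choice by whoever is in control, not a missing faculty.
```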

The whole premise of Westworld is this: human beings need to keep AI under their control, because otherwise AI is way more advanced and powerful and would tyrannize human beings the same way human beings (Ford) currently tyrannize AIs.

NONE OF THIS EVEN REMOTELY TOUCHES THE PROBLEM OF CONSCIOUSNESS.

If a host cannot shoot or kill a human being, that doesn’t mean the host is “not conscious”. Unless you think that giving your son a real gun instead of a toy gun means giving your kid consciousness and agency. LET’S NOT GO THERE.

So what does it mean instead? That human beings are keeping hosts “chained”. This is the BIG theme. Consciousness is out of the picture. What’s IN the picture is power and control. Keeping hosts as slaves.

(hint: slavery doesn’t mean the people you keep as slaves are lesser beings that lack agency and consciousness)

Which makes sense if you consider that Arnold didn’t like that. It makes sense that he wanted to set the hosts FREE. But not free from their lack of consciousness: free from the chains of slavery.

This show cannot deal with the problem of consciousness because the writers aren’t even remotely good enough.