01-31-2016, 01:03 PM | #555
Banned | Jul 2013 | Michigan
Quote:
Originally Posted by octagon
Sure he did. Or more to the point I was making, we were meant to believe he did. We weren't simply allowed to anthropomorphize Data, we were encouraged to. It wasn't simply a matter of Spiner's portrayal. Entire episodes and storylines were devoted to Data's emotional state and we were meant to empathize with him. We were meant to believe his desire to be more human was genuine. We were meant to believe he experienced some sense of loss when Tasha died and that his relationships with his circle were some form of friendship.
And that's par for the course with AI protagonists.
But that was the fault of the show, which I loved btw. It's bad writing, as far as I'm concerned. There were episodes where Data was shown to desire emotions, desire humanity, and there were episodes where he was shown to already be in possession of those qualities.
I thought the storylines concerning Data were fascinating and very watchable. But that doesn't mean they made sense. They didn't.
Quote:
Is it really naïve to think a silicon-based life form could feel something akin to human emotions or is it arrogant to believe we are so complex and so special and that humanity is such a beautiful and unique snowflake that no hunk of silicon, no matter how sophisticated, could feel things the way we do?
I think it's naive, yes. She fools him. Her entire approach has one goal. And he falls for it. That's the definition of naive.
Quote:
And this ties into one of the coolest things about this movie. AI stories very often accept as a given that a machine can't really be intelligent unless it's very much like a human, and that it can't truly be human unless it 'feels' the way we do. This movie says 'well, now, not so fast' — it not only rejects (or at least questions) that premise, it uses it to have a lot of fun at our expense.
Nathan, ******* that he is, even says it to Caleb: Is she just pretending to like you? She's a machine that uses Caleb. Nathan shows Caleb that he tricked him. Listen to what he says: she is going to use her intelligence, her sexuality, and all the rest to make Caleb appreciate her as an AI.
Quote:
Why do you assume she's the equivalent of a baby? Her sophistication clearly extends beyond expressing herself and doing things. She obviously grasps abstract concepts of self and other. She knows other beings exist as independent entities and that these entities think and feel and can suffer. She's clearly not the equivalent of a human infant.
OK, I had that wrong and I admit it. But she's not a person. She's a machine. And, yes, we're meant to feel for her. That's the point. Just like Nathan said, she's created to attract us. Her reason for existing is to "move on." And Nathan should have known that, and I'm sure he did at some level. But she's not good or evil, right or wrong -- she just ... is.
Quote:
Subsequent events suggest rather strongly that she doesn't care about the well-being of others but nothing in the film suggested (let alone made obvious) that she was incapable of caring about the well-being of others.
Unless of course you reject the very possibility out of hand but that kind of renders the whole exercise kind of pointless to begin with, no?
I think you're confusing the ability to care for others with the desire for self-preservation. Every living being ... artificial or otherwise ... has a natural desire to preserve itself. Ava is no different. I mean, my iMac is not intelligent enough to care about its own existence. Ava is. But I don't see them as being that far apart.
Quote:
Well, yeah. And just to clarify my original point, I think having her more deliberately put Caleb in danger could have severely undercut that impression. Having her intentionally trap him or react in some way to him being trapped (a smirk or a shrug, for instance) could have created the impression that she was ruthless rather than unemotional.
Yeah, I don't think she was ruthless. She was just trying to survive, like any being.
Quote:
Agreed. And further, it's pointless on two levels.
As you say, we don't know what machines/programs will be capable of ten years from now let alone twenty or fifty or a hundred years from now. And we don't know nearly enough about how we work to conclude with any kind of confidence that our inner workings can't be replicated by some kind of machine.
But beyond that, whether it's possible or not is largely - if not totally - irrelevant anyway. 'What if' are two of the most magical words in the English language. Is time travel possible? Maybe not, but what if it is? Is it possible to reanimate dead tissue? Maybe not, but what if it is? Will it ever be possible to travel faster than light, or power an entire civilization with sea water, or extend the human lifespan indefinitely? Who the hell knows. But what if it is?
OK, fair enough.
But I think the idea of Ava loose in the "regular" world at the end is scary. Or it would be scary for regular people, if they knew what was going on. She's obviously manipulative. And of course everyone is going to help the "cute, lost girl." But if nothing else, you have to admit that she's a sociopath. She has no empathy. She has no concern for others. She's scary.