Quote:
Originally Posted by AaronJ
See, here's where we disagree ...
1) Data didn't have emotions. Yes, the way in which Spiner portrayed him occasionally showed him as having emotions, in a very limited fashion. But Data, itself, did not really have emotions.
Sure he did. Or more to the point I was making, we were meant to believe he did. We weren't simply allowed to anthropomorphize Data, we were encouraged to. It wasn't simply a matter of Spiner's portrayal. Entire episodes and storylines were devoted to Data's emotional state and we were meant to empathize with him. We were meant to believe his desire to be more human was genuine. We were meant to believe he experienced some sense of loss when Tasha died and that his relationships with his circle were some form of friendship.
And that's par for the course with AI protagonists.
Quote:
Originally Posted by AaronJ
Caleb was naive. And I don't blame him, as I probably would have fallen for the same thing. But that doesn't change the fact that he expected silicon to somehow inherit human emotions (let's not get started on what I think about human emotions, but still ...).
Is it really naïve to think a silicon-based life form could feel something akin to human emotions? Or is it arrogant to believe we are so complex and so special, that humanity is such a beautiful and unique snowflake, that no hunk of silicon, no matter how sophisticated, could feel things the way we do?
And this ties into one of the coolest things about this movie. AI stories very often accept as a given that a machine can't really be intelligent unless it's very much like a human, and that it can't truly be human unless it 'feels' the way we do. This movie says 'well, now, not so fast': it not only rejects (or at least questions) that premise, it uses it to have a lot of fun at our expense.
Quote:
Originally Posted by AaronJ
Think of it this way: There's a reason why a human being under the age of 18 can't be diagnosed as a sociopath. Because ALL humans under 18 are essentially sociopaths. And Ava isn't even "under 18." She's the equivalent of a baby, but with a lot more ability to express herself and do things. No one would expect a 1 year-old to put someone else's well-being ahead of hers'. So I don't understand why anyone would expect the same from Ava.
Why do you assume she's the equivalent of a baby? Her sophistication clearly extends beyond expressing herself and doing things. She obviously grasps abstract concepts of self and other. She knows other beings exist as independent entities and that these entities think and feel and can suffer. She's clearly not the equivalent of a human infant.
Subsequent events suggest rather strongly that she doesn't care about the well-being of others but nothing in the film suggested (let alone made obvious) that she was incapable of caring about the well-being of others.
Unless, of course, you reject the very possibility out of hand, but that rather renders the whole exercise pointless to begin with, no?
Quote:
Originally Posted by AaronJ
She's not good. She's not evil. She's not ANYTHING. She has her own interests, and everything happening around her is as relevant to her as your neighbor's garage light turning on or off.
Well, yeah. And just to clarify my original point, I think having her more deliberately put Caleb in danger could have severely undercut that impression. Having her intentionally trap him or react in some way to him being trapped (a smirk or a shrug, for instance) could have created the impression that she was ruthless rather than unemotional.
Quote:
Originally Posted by kylec123
I've always found arguing for or against the possibility of "Ava-like" AI to be pointless.
Agreed. And further, it's pointless on two levels.
As you say, we don't know what machines/programs will be capable of ten years from now let alone twenty or fifty or a hundred years from now. And we don't know nearly enough about how we work to conclude with any kind of confidence that our inner workings can't be replicated by some kind of machine.
But beyond that, whether it's possible or not is largely - if not totally - irrelevant anyway. 'What if' are two of the most magical words in the English language. Is time travel possible? Maybe not, but what if it is? Is it possible to reanimate dead tissue? Maybe not, but what if it is? Will it ever be possible to travel faster than light, or power an entire civilization with sea water, or extend the human lifespan indefinitely? Who the hell knows. But what if it is?