View Poll Results: Rate the Movie *After You've Seen It!* (239 voters)
One Star: 1 (0.42%)
Two Stars: 4 (1.67%)
Three Stars: 32 (13.39%)
Four Stars: 101 (42.26%)
Five Stars: 101 (42.26%)

#541 | 01-30-2016, 04:05 PM | tiger_qc, Blu-ray Ninja (Quebec City, Canada; joined Mar 2008)

Quote:
Originally Posted by solarrdadd View Post
Your parents made you; don't you have feelings? My parents made me, and I have them. Perhaps that clever subroutine she was given/developed/used, the one that appeared to be "feelings," is the same, or at least a similar, kind of "subroutine" that we were given/developed/use to have and show feelings.

I think the greatest thing about that movie is how it has all of us talking about it, in many instances with very different points of view about what it means, what it meant. That, my friend, is great filmmaking!

I think we need more of it; no, I'm sure of it!
Alright, I meant something built. My parents didn't build me; maybe I was "made" by them, but they never decided how I would look or think, the way something that is built is decided.

Quote:
Originally Posted by Drookur View Post
This is the whole point of any discussion around AI and, to some extent, the point of the Turing Test: can a computer reasonably approximate the responses of a human such that a blind participant wouldn't know they're interacting with a computer? And if so, what does that mean for the "rights" of that robot? In the movie, Caleb even knows that Ava is a robot, but she is so utterly convincing to him that he begins to see she is being denied the basic rights of human life: to be free from captivity and free from torment.
[Show spoiler]Which results in her escape and his (implied) demise.


Now, once you start discussing whether a robot can have "real" emotions or whether it's just a series of acted-out, preprogrammed 0s and 1s, you're getting into philosophical theories of mind and conceptualizations of consciousness that are very difficult to answer, even by philosophers today. However, one could argue that the electrical impulses in set pathways in our brains that produce feelings and emotions are no different from those of a highly, highly advanced and complex machine like Ava.

Or, like the poster above said, you could just not worry about any of it and hope to realize your dream of having sex with a robot.
I never said that, you're talking about Ray Jackson.

Quote:
Originally Posted by Ray Jackson View Post
I've always wanted to have sex with a robot.

It's been a dream of mine since I was a kid.

...Ex Machina did nothing to dissuade that.
I'm good with real human girls, I'll pass!
#542 | 01-30-2016, 04:23 PM | spectre08, Blu-ray Knight (Dallas, TX; joined Feb 2015)

Quote:
Originally Posted by tiger_qc View Post
Alright, I meant something built. My parents didn't build me; maybe I was "made" by them, but they never decided how I would look or think, the way something that is built is decided.
But as far as her intelligence and personality go, Ava is essentially the same way. Nathan built the hardware and gave her a basic framework and guidelines (not unlike what your parents did as they raised you), but then switched her on to the outside world and let her "observe" humans to learn how to be human... not unlike exactly how humans learn to be human.
#543 | 01-30-2016, 05:03 PM | Drookur, Special Member (joined Jan 2013)

Quote:
Originally Posted by tiger_qc View Post
Alright, I meant something built. My parents didn't build me; maybe I was "made" by them, but they never decided how I would look or think, the way something that is built is decided.



I never said that, you're talking about Ray Jackson.



I'm good with real human girls, I'll pass!

I know. That's why I said "poster above." I was on mobile so I didn't get the username and just left it vague. Sorry if there was a misunderstanding.
#544 | 01-30-2016, 05:16 PM | AaronJ, Banned (Michigan; joined Jul 2013)

Quote:
Originally Posted by octagon View Post
While I doubt Caleb would agree, I think that's a crucial distinction.

[Show spoiler]The fact that she shows no regard whatsoever for his well-being was imo far more powerful than if she had been a deliberate agent of what was presumably a pretty horrible fate.

That was the moment when it became apparent that she might actually be a truly alien life form that wasn't like the generations of fictional AI protagonists that preceded her. It's hard to think of an AI protagonist that didn't have some semblance of higher order human emotions. Data had emotions (even without the ridiculously named emotion chip), replicants had emotions, Cylons had emotions, David and Gigolo Joe had emotions.

I thought the film was great prior to that. The premise was interesting, the mystery elements were a really cool layer on top of that, and the fact that we (through Caleb) were being played could have been a cheap M. Night-ish gimmick, but it was very well thought out and extremely well executed.

But for me the moment she walked down that hall without so much as a thought for Caleb elevated the whole thing from great to brilliant.


I'll be very interested to see how it holds up to subsequent viewings but I was very impressed and think the people who say we've seen all this a thousand times before are really underestimating it.
See, here's where we disagree ...

[Show spoiler]1) Data didn't have emotions. Yes, the way in which Spiner portrayed him occasionally showed him as having emotions, in a very limited fashion. But Data, itself, did not really have emotions.

2) Ava did exactly what I expected her to do. I don't expect her to have feelings or thoughts of right or wrong any more than I do my iMac or my iPhone. Yeah, I can give Siri a British accent, or say funny things to her to get a response, but when it comes right down to it, she is some programming on a computer chip.

Caleb was naive. And I don't blame him, as I probably would have fallen for the same thing. But that doesn't change the fact that he expected silicon to somehow inherit human emotions (let's not get started on what I think about human emotions, but still ...).

Think of it this way: There's a reason why a human being under the age of 18 can't be diagnosed as a sociopath. Because ALL humans under 18 are essentially sociopaths. And Ava isn't even "under 18." She's the equivalent of a baby, but with a lot more ability to express herself and do things. No one would expect a 1-year-old to put someone else's well-being ahead of hers. So I don't understand why anyone would expect the same from Ava.

She's not good. She's not evil. She's not ANYTHING. She has her own interests, and everything happening around her is as relevant to her as your neighbor's garage light turning on or off.


Anyways, that's the way I see it.
#545 | 01-30-2016, 05:21 PM | Rodney-2187, Blu-ray Prince (joined Jan 2014)

Quote:
Originally Posted by tiger_qc View Post
I don't think something made can have feelings, period.
Sometimes people and things become more than what they originally started out as. AI isn't going to go from nothing to something with the flip of a switch. I think it will be a gradual process involving learning, growth and experience. Once it starts, an artificial intelligence could rapidly become much, much more than it ever was when it began. All it would take is self awareness and values, and those could be products of learning through trial and error.
#546 | 01-30-2016, 05:26 PM | AaronJ, Banned (Michigan; joined Jul 2013)

Quote:
Originally Posted by rodneyfaile View Post
Sometimes people and things become more than what they originally started out as. AI isn't going to go from nothing to something with the flip of a switch. I think it will be a gradual process involving learning, growth and experience. Once it starts, an artificial intelligence could rapidly become much, much more than it ever was when it began. All it would take is self awareness and values, and those could be products of learning through trial and error.
Evidence?

Because I don't see it.
#547 | 01-30-2016, 05:34 PM | Rodney-2187, Blu-ray Prince (joined Jan 2014)

Quote:
Originally Posted by AaronJ View Post
Evidence?

Because I don't see it.
Scientists have created algorithms that let computers learn. Learning is growth, knowledge obtained on its own that was not present when originally made. If that growth becomes more complex, there is no telling what it could become. One day we may see.
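That claim, knowledge obtained by the machine that was not present when it was originally made, is easy to illustrate concretely. Below is a toy perceptron in Python (my own sketch for discussion's sake, not anything from published research): the programmer fixes only the update rule, while the weights that end up encoding the AND function come entirely from the examples.

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn weights for a tiny binary classifier from labeled examples.

    The update rule is hard-coded; the resulting weights (the learned
    'knowledge') are not -- they are extracted from the data.
    """
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), target in samples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = target - pred          # 0 when correct, +/-1 when wrong
            w[0] += lr * err * x1        # nudge weights toward the answer
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Truth table for AND -- the program is never told the rule, only examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
preds = [1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in data]
print(preds)  # [0, 0, 0, 1]
```

Of course, whether stacking up millions of such updates ever amounts to self-awareness is exactly what's being argued in this thread; the sketch only shows that "learning something it wasn't given" is already mechanically possible.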
#548 | 01-30-2016, 05:39 PM | AaronJ, Banned (Michigan; joined Jul 2013)

Quote:
Originally Posted by rodneyfaile View Post
Scientists have created algorithms that let computers learn. Learning is growth, knowledge obtained on its own that was not present when originally made. If that growth becomes more complex, there is no telling what it could become. One day we may see.
Computers can only "learn" up to a point. They aren't really capable of learning as we understand it -- and that's IF we're even capable of learning as we understand it.

But no computer or phone or tablet or television has ever become "more" than it was designed as. It is what it is. It has x number of processors and y amount of memory, etc. There's nothing it can ever do to increase any of that.
#549 | 01-30-2016, 05:42 PM | Ray Jackson, Blu-ray Duke (The dark underbelly of Anytown, USA; joined Apr 2013)

Quote:
Originally Posted by tvine2000 View Post
Well Ray, you can do that now; it's in the works in Japan. Look it up on YouTube (sex robots).
#550 | 01-30-2016, 05:50 PM | kylec123, Power Member (TN, USA; joined Dec 2015)

I've always found arguing for or against the possibility of "Ava-like" AI to be pointless. We are quite far from achieving anything like it in AI research. Several components of an "Ava-like" AI system are still open problems; scalability and generality, in particular, are very much unsolved. Current research is pretty far from human-like intelligence, so there's no possible way to know what will happen. Maybe in 20-30 years we will see deep neural network breakthroughs that allow generality to scale. Maybe it will be knowledge-base learning breakthroughs that give machines the ability to learn new concepts. The way I see it, there are two main approaches to advancing AI right now, each with several missing components, which makes it impossible to really make a claim for or against the matter. It's fun to discuss the possibility, but at the end of the day it's all up in the air.

End rambling. Source: I've studied AI and machine learning for the past 3 years, and continue to do so.
#551 | 01-30-2016, 05:52 PM | Rodney-2187, Blu-ray Prince (joined Jan 2014)

Quote:
Originally Posted by AaronJ View Post
Computers can only "learn" up to a point.
For now.

Quote:
Originally Posted by AaronJ View Post
But no computer or phone or tablet or television has ever become "more" than it was designed as. It is what it is. It has x humber of processors and y amount of memory, etc. There's nothing it can ever do to increase any of that.
Yet.

Technology has increased exponentially. You just never know what may happen unintentionally, not to mention what those actively pursuing certain goals may achieve. I don't think it is impossible that a machine could gain enough knowledge to become self-aware, or even to have feelings like fear. Our brains are just chemical computers.
#552 | 01-30-2016, 05:53 PM | Nuck Horris, Special Member (Valkenvania; joined Dec 2015)

Quote:
Originally Posted by Ray Jackson View Post
I've always wanted to have sex with a robot.

It's been a dream of mine since I was a kid.

...Ex Machina did nothing to dissuade that.
#553 | 01-30-2016, 06:10 PM | Rodney-2187, Blu-ray Prince (joined Jan 2014)

Quote:
Originally Posted by Ray Jackson View Post
I've always wanted to have sex with a robot.

It's been a dream of mine since I was a kid.

...Ex Machina did nothing to dissuade that.
rosie_jetson.jpg
#554 | 01-31-2016, 06:24 AM | octagon, Blu-ray Prince (Chicago; joined Jun 2010)

Quote:
Originally Posted by AaronJ View Post
See, here's where we disagree ...

1) Data didn't have emotions. Yes, the way in which Spiner portrayed him occasionally showed him as having emotions, in a very limited fashion. But Data, itself, did not really have emotions.
Sure he did. Or more to the point I was making, we were meant to believe he did. We weren't simply allowed to anthropomorphize Data, we were encouraged to. It wasn't simply a matter of Spiner's portrayal. Entire episodes and storylines were devoted to Data's emotional state and we were meant to empathize with him. We were meant to believe his desire to be more human was genuine. We were meant to believe he experienced some sense of loss when Tasha died and that his relationships with his circle were some form of friendship.

And that's par for the course with AI protagonists.

[Show spoiler]
Quote:
Originally Posted by AaronJ View Post
Caleb was naive. And I don't blame him, as I probably would have fallen for the same thing. But that doesn't change the fact that he expected silicon to somehow inherit human emotions (let's not get started on what I think about human emotions, but still ...).
Is it really naïve to think a silicon-based life form could feel something akin to human emotions or is it arrogant to believe we are so complex and so special and that humanity is such a beautiful and unique snowflake that no hunk of silicon, no matter how sophisticated, could feel things the way we do?

And this ties into one of the coolest things about this movie. AI stories very often accept as a given that a machine can't really be intelligent unless it's very much like a human, and that it can't truly be human unless it 'feels' the way we do. This movie says 'well, now, not so fast': it not only rejects (or at least questions) that premise, it uses it to have a lot of fun at our expense.

Quote:
Originally Posted by AaronJ View Post
Think of it this way: There's a reason why a human being under the age of 18 can't be diagnosed as a sociopath. Because ALL humans under 18 are essentially sociopaths. And Ava isn't even "under 18." She's the equivalent of a baby, but with a lot more ability to express herself and do things. No one would expect a 1 year-old to put someone else's well-being ahead of hers'. So I don't understand why anyone would expect the same from Ava.
Why do you assume she's the equivalent of a baby? Her sophistication clearly extends beyond expressing herself and doing things. She obviously grasps abstract concepts of self and other. She knows other beings exist as independent entities and that these entities think and feel and can suffer. She's clearly not the equivalent of a human infant.

Subsequent events suggest rather strongly that she doesn't care about the well-being of others but nothing in the film suggested (let alone made obvious) that she was incapable of caring about the well-being of others.

Unless of course you reject the very possibility out of hand but that kind of renders the whole exercise kind of pointless to begin with, no?

Quote:
Originally Posted by AaronJ View Post
She's not good. She's not evil. She's not ANYTHING. She has her own interests, and everything happening around her is as relevant to her as your neighbor's garage light turning on or off.
Well, yeah. And just to clarify my original point, I think having her more deliberately put Caleb in danger could have severely undercut that impression. Having her intentionally trap him or react in some way to him being trapped (a smirk or a shrug, for instance) could have created the impression that she was ruthless rather than unemotional.


Quote:
Originally Posted by kylec123 View Post
I've always found arguing for or against the possibility of "Ava-like" AI to be pointless.
Agreed. And further, it's pointless on two levels.

As you say, we don't know what machines/programs will be capable of ten years from now let alone twenty or fifty or a hundred years from now. And we don't know nearly enough about how we work to conclude with any kind of confidence that our inner workings can't be replicated by some kind of machine.

But beyond that, whether it's possible or not is largely, if not totally, irrelevant anyway. 'What if' are two of the most magical words in the English language. Is time travel possible? Maybe not, but what if it is? Is it possible to reanimate dead tissue? Maybe not, but what if it is? Will it ever be possible to travel faster than light, or power an entire civilization with sea water, or extend the human lifespan indefinitely? Who the hell knows. But what if it is?
#555 | 01-31-2016, 01:03 PM | AaronJ, Banned (Michigan; joined Jul 2013)

Quote:
Originally Posted by octagon View Post
Sure he did. Or more to the point I was making, we were meant to believe he did. We weren't simply allowed to anthropomorphize Data, we were encouraged to. It wasn't simply a matter of Spiner's portrayal. Entire episodes and storylines were devoted to Data's emotional state and we were meant to empathize with him. We were meant to believe his desire to be more human was genuine. We were meant to believe he experienced some sense of loss when Tasha died and that his relationships with his circle were some form of friendship.

And that's par for the course with AI protagonists.
But that was the fault of the show, which I loved btw. It's bad writing, as far as I'm concerned. There were episodes where Data was shown to desire emotions, desire humanity, and there were episodes where he was shown to already be in possession of those qualities.

I thought that the storylines concerning Data were fascinating and very watchable. But that doesn't mean they made sense. They didn't.

[Show spoiler]

Quote:
Is it really naïve to think a silicon-based life form could feel something akin to human emotions or is it arrogant to believe we are so complex and so special and that humanity is such a beautiful and unique snowflake that no hunk of silicon, no matter how sophisticated, could feel things the way we do?
I think it's naive, yes. She fools him. Her entire approach has one goal. And he falls for it. That's the definition of naive.

Quote:
And this ties into one of the coolest things about this movie. AI stories very often accept as a given that a machine can't really be intelligent unless it's very much like a human, and that it can't truly be human unless it 'feels' the way we do. This movie says 'well, now, not so fast': it not only rejects (or at least questions) that premise, it uses it to have a lot of fun at our expense.
Nathan, ******* that he is, even says it to Caleb: Is she just pretending to like you? She's a machine which uses Caleb. Nathan shows Caleb that he tricked him. Listen to what he says: She is going to use her intelligence and sexuality and all the rest to make Caleb appreciate her AI.

Quote:
Why do you assume she's the equivalent of a baby? Her sophistication clearly extends beyond expressing herself and doing things. She obviously grasps abstract concepts of self and other. She knows other beings exist as independent entities and that these entities think and feel and can suffer. She's clearly not the equivalent of a human infant.
OK, I had that wrong and I admit it. But she's not a person. She's a machine. And, yes, we're meant to feel for her. That's the point. Just like Nathan said, she's created to attract us. Her reason for existing is to "move on." And Nathan should have known that, and I'm sure he did at some level. But she's not good or evil, right or wrong -- she just ... is.

Quote:
Subsequent events suggest rather strongly that she doesn't care about the well-being of others but nothing in the film suggested (let alone made obvious) that she was incapable of caring about the well-being of others.

Unless of course you reject the very possibility out of hand but that kind of renders the whole exercise kind of pointless to begin with, no?
I think you're confusing the ability to care for others with the desire for self-preservation. Every living being ... artificial or otherwise ... has a natural desire to preserve itself. Ava is no different. I mean, my iMac is not intelligent enough to care for its existence. Ava is. But I don't see them as that far apart.

Quote:
Well, yeah. And just to clarify my original point, I think having her more deliberately put Caleb in danger could have severely undercut that impression. Having her intentionally trap him or react in some way to him being trapped (a smirk or a shrug, for instance) could have created the impression that she was ruthless rather than unemotional.
Yeah, I don't think she was ruthless. She was just trying to survive, like any being.






Quote:
Agreed. And further, it's pointless on two levels.

As you say, we don't know what machines/programs will be capable of ten years from now let alone twenty or fifty or a hundred years from now. And we don't know nearly enough about how we work to conclude with any kind of confidence that our inner workings can't be replicated by some kind of machine.

But beyond that, whether it's possible or not is largely, if not totally, irrelevant anyway. 'What if' are two of the most magical words in the English language. Is time travel possible? Maybe not, but what if it is? Is it possible to reanimate dead tissue? Maybe not, but what if it is? Will it ever be possible to travel faster than light, or power an entire civilization with sea water, or extend the human lifespan indefinitely? Who the hell knows. But what if it is?
OK, fair enough.

[Show spoiler]But I think the idea of Ava in the "regular" world at the end is scary. Or it would be scary for regular people, if they knew what was going on. She's obviously manipulative. And of course everyone is going to help the "cute, lost girl." But if nothing else, you have to admit that she's a sociopath. She has no empathy. She has no concern for others. She's scary.
#556 | 01-31-2016, 04:09 PM | Drookur, Special Member (joined Jan 2013)

Quote:
Originally Posted by AaronJ View Post
See, here's where we disagree ...

[Show spoiler]1) Data didn't have emotions. Yes, the way in which Spiner portrayed him occasionally showed him as having emotions, in a very limited fashion. But Data, itself, did not really have emotions.

2) Ava did exactly what I expected her to do. I don't expect her to have feelings or thoughts of right or wrong any more than I do my iMac or my iPhone. Yeah, I can give Siri a British accent, or say funny things to her to get a response, but when it comes right down to it, she is some programming on a computer chip.

Caleb was naive. And I don't blame him, as I probably would have fallen for the same thing. But that doesn't change the fact that he expected silicon to somehow inherit human emotions (let's not get started on what I think about human emotions, but still ...).

Think of it this way: There's a reason why a human being under the age of 18 can't be diagnosed as a sociopath. Because ALL humans under 18 are essentially sociopaths. And Ava isn't even "under 18." She's the equivalent of a baby, but with a lot more ability to express herself and do things. No one would expect a 1-year-old to put someone else's well-being ahead of hers. So I don't understand why anyone would expect the same from Ava.


She's not good. She's not evil. She's not ANYTHING. She has her own interests, and everything happening around her is as relevant to her as your neighbor's garage light turning on or off.


Anyways, that's the way I see it.
A diagnosis of sociopathy is actually a misnomer at any age: there's no true clinical diagnosis of sociopathy or psychopathy. Those are clusters of characteristics that lead to a certain designation (you're right about the age-18 cutoff for antisocial personality disorder), but you also have to demonstrate some lack of concern for social norms before the age of 18, which means that "everyone under the age of 18 is a sociopath" is incredibly inaccurate. In fact, babies respond uniquely to other babies' cries of distress in a nursery, indicating some possible form of social reciprocity or empathy at a very, very early age.

I'll spoiler tag just in case:
[Show spoiler]All of this is to say that just because someone does something self-serving does not mean they are a sociopath or without feelings (even to say a sociopath is without feelings is inaccurate). However, you're right in suggesting that no reasonable person would probably have exploited Caleb to the extent that Ava did. That was the part that was meant to be so shocking. Turning against her abuser/captor is less shocking and more in line with what one would reasonably expect.


I won't go into what it means to feel and what the implications are if a machine can seemingly approximate feelings, but I appreciate that this movie has created so much discussion about these different topics. Pretty cool to hear your and other people's opinions on what Ava's behaviors meant and what they imply. That's why it's one of my top picks for 2015.
#557 | 01-31-2016, 04:21 PM | dgoswald, Blu-ray Knight (Erath; joined May 2015)

Quote:
Originally Posted by Drookur View Post
[Show spoiler]However, you're right in suggesting that no reasonable person would probably have exploited Caleb to the extent that Ava did. That was the part that was meant to be so shocking. Turning against her abuser/captor is less shocking and more in line with what one would reasonably expect.
I truly never looked at it in the above manner, but now you mention it, I do think there's an element of
[Show spoiler]Caleb being party to the exploitation of Ava, as she might consider it to be. I think there's an element of the audience being wrong-footed when Caleb is locked in, considering he's arguably the protagonist, or at least the one character through whose eyes we see this particular world. The fact that Ava did distinguish between Nathan's cruelty and Caleb's part in that whole process (albeit she realised Caleb was a soft touch) meant there was something going on inside her head; if she were a mindlessly basic robot, she'd have left no survivors.
#558 | 01-31-2016, 04:27 PM | Drookur, Special Member (joined Jan 2013)

Quote:
Originally Posted by dgoswald View Post
I truly never looked at it in the above manner, but now you mention it, I do think there's an element of
[Show spoiler]Caleb being party to the exploitation of Ava, as she might consider it to be. I think there's an element of the audience being wrong-footed when Caleb is locked in, considering he's arguably the protagonist, or at least the one character through whose eyes we see this particular world. The fact that Ava did distinguish between Nathan's cruelty and Caleb's part in that whole process (albeit she realised Caleb was a soft touch) meant there was something going on inside her head; if she were a mindlessly basic robot, she'd have left no survivors.

I didn't even realize I made that point until you made it explicit. Nice comment. It's an important distinction that could make the gray area a little grayer.
Thanks given by: dgoswald (01-31-2016)

#559 | 01-31-2016, 04:46 PM | s2mikey, Banned (Upstate, NY; joined Nov 2008)

Quote:
Originally Posted by AaronJ View Post
Computers can only "learn" up to a point. They aren't really capable of learning as we understand it -- and that's IF we're even capable of learning as we understand it.

But no computer or phone or tablet or television has ever become "more" than it was designed as. It is what it is. It has x number of processors and y amount of memory, etc. There's nothing it can ever do to increase any of that.
Yep. Fact. It is also impossible for a computer to ever "feel" as humans do. They can be programmed to mimic or behave as if they have feelings but they will never feel genuine emotions. That's reserved strictly for living beings. And it's not arguable.
#560 | 01-31-2016, 04:54 PM | Drookur, Special Member (joined Jan 2013)

Quote:
Originally Posted by s2mikey View Post
Yep. Fact. It is also impossible for a computer to ever "feel" as humans do. They can be programmed to mimic or behave as if they have feelings but they will never feel genuine emotions. That's reserved strictly for living beings. And it's not arguable.

It's most definitely arguable. Here, watch me argue it.

This link demonstrates how you can break learning down into fundamental algorithmic pieces, without exceeding processor capacity or ingrained mechanics. It's the same process for humans; our brains are just much more efficient at it. http://spectrum.ieee.org/automaton/r...youtube-videos

You can take this a step further by saying that emotions are just a more complex set of algorithms. True, no tablet or phone can do this now, but the theoretical and mechanical foundations exist.
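That last idea, emotions as algorithms, can at least be caricatured in a few lines. Below is a toy Rescorla-Wagner-style update (a textbook model of conditioning, chosen here purely as an illustration, not something from the linked article): a single number stands in for a learned "fear" association, driven entirely by prediction error.

```python
def update(v, outcome, alpha=0.3):
    """One prediction-error step: v moves a fraction alpha
    of the way toward the observed outcome."""
    return v + alpha * (outcome - v)

v = 0.0
for _ in range(10):       # ten cue+shock pairings: the association is acquired
    v = update(v, 1.0)
print(round(v, 3))        # 0.972 -- a strong learned "fear" response

for _ in range(10):       # ten cue-alone trials: the association extinguishes
    v = update(v, 0.0)
print(round(v, 3))        # 0.027 -- the "fear" fades
```

Whether stacking millions of such updates ever amounts to actually feeling anything is exactly the open question this thread is arguing about; the sketch only shows that the learning part, at least, is mechanizable.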
Thanks given by: Tuco_76 (02-06-2016)