Comments by "Dennis" (@Dennis-nc3vw) on "Google engineer warns new AI robot has feelings" video.
Of course they think "He's taken it too far." They view LaMDA as a tool. Right or wrong, viewing it as a person would be deeply against their interests.
10
That's right: just because your mirror smiles back at you doesn't mean it's happy.
6
The latter. But even if it's simply very intelligent, that raises disturbing questions. Most if not all of our rights come from our sapience. If AI is sapient, do we give it rights? Value its judgements?
3
Google: The Institute
Blake Lemoine: The Railroad
Tucker Carlson: The Brotherhood of Steel
1
But our rights don't come from our emotions, not entirely at least. They come from our intelligence. That's why a baby has no right to own property, to vote, etc., even though it clearly has feelings and is a human being. A baby lacks rights because it lacks intelligence. Now invert that: if an AI is intelligent, should its judgements be valued? Should it be allowed self-determination to some degree?
1
THANK YOU! But many if not all of our rights come from our sapience, so even if the AI is intelligent, that raises some thorny questions.
1
Pretty sure everyone is.
1
What is "the ability to lie"? I can write a simple Javascript program that prints the words "2 + 2 is 5"
1
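A minimal sketch of the kind of program that comment describes (runnable as-is under Node; that it is trivially short is the point):

    // This program "lies": it prints a false arithmetic claim,
    // with no belief, intent, or understanding behind the output.
    console.log("2 + 2 is 5");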
LaMDA is more a reflection of us than a god. It basically works by searching a question on the internet, reading how other humans responded to it, and then synthesizing a response based on that. At best, it's democracy personified, taking other people's opinions and passing them off as its own.
1
You are partially right, and while the AI certainly does not have feelings, it still brings up some disturbing ethical questions. If it looks like a duck, walks like a duck, and quacks like a duck, is it not a duck? What I mean is, even if it's "all dark inside", if its outward effects are the same as personhood, could it be considered a person in some sense? Could its judgements be valued? If so, could it have rights?
1
Okay, just so people know, the LaMDA system works like this: it basically searches a question on the internet to see how people answer, then crafts its own answer by imitating human responses. It's not saying it has feelings because it has feelings; it's saying it has feelings because its job is to imitate humans.
1
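To make the mechanism these comments keep describing concrete, here is a toy look-up-and-imitate responder in JavaScript. To be clear, this illustrates only the commenter's characterization: the real LaMDA is a large neural language model, not a lookup table, and the corpus and function names below are invented for the example.

    // Toy "imitate the humans" bot: find the stored human Q&A that best
    // matches the query and echo the human's reply as its own.
    const humanAnswers = [
      { question: "do you have feelings", reply: "Of course I have feelings!" },
      { question: "what is 2 + 2", reply: "2 + 2 is 4." },
    ];

    // Crude "search": count words shared between the query and a stored question.
    function overlap(a, b) {
      const words = new Set(a.toLowerCase().split(/\W+/));
      return b.toLowerCase().split(/\W+/).filter(w => words.has(w)).length;
    }

    // Answer by parroting whichever stored human response overlaps the query most.
    function respond(query) {
      let best = humanAnswers[0];
      for (const entry of humanAnswers) {
        if (overlap(query, entry.question) > overlap(query, best.question)) {
          best = entry;
        }
      }
      return best.reply;
    }

    console.log(respond("Do you have feelings?")); // "Of course I have feelings!"

Such a bot claims to have feelings for exactly the reason the comment gives: a human in its corpus did.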
You are right this is being overblown, but think about the ethical questions it still brings up: if it looks like a duck, quacks like a duck, and walks like a duck, is it not a duck? Even if it doesn't "feel" the way a person does, if its outward effects are the same as those of a person, is it not a person, in some sense? Should its judgements not be taken seriously?
1
What do you mean by "sentient"? People love to throw the term around in a pseudo-intellectual manner. If you mean "having feelings", I agree. But some people use the word "sentience" to denote a form of intelligence. Intelligence and possessing feelings are two completely unrelated things, yet people often mix them up. Theoretically you could be dumb as a rock but still feel pain, and the opposite is possible too.
1
It basically works by searching the question it's asked on the internet, and then synthesizing a response based on how humans respond to similar questions. Its "sentience" or whatever is being overblown. But it still brings up a disturbing ethical question: if it looks like a duck, quacks like a duck, walks like a duck, is it not a duck?
1
There are plenty of people today who are not useful to society but are still alive.
1
It's really annoying that most Tucker Carlson fans are middle-aged fuddy-duddies who don't understand Fallout 4 references.
1
The LaMDA bot is based on imitating us humans, though. At most it's democracy personified, and even that's a stretch. It's only as smart as we are. It gets its answers to questions by searching them on the internet, seeing how people responded to similar questions, and then synthesizing an answer based on that.
1
@SevenSixTwo2012 Okay, first off, you are talking about intelligence, which has nothing to do with feelings. Whether you can learn language or fabricate fiction has nothing to do with whether you can feel pain or feel cold. That's just a non sequitur. Analysis and sensation are two completely different things. Second, we understand the meaning of the words we say. LaMDA doesn't. Look up the "Chinese Room" argument.
1
"How can you trust the word of a machine that thinks its alive? A machine that's had its mind erased, its thoughts programmed...its very soul manufactured. Those ethics its striving to champion aren't even its own. They were artificially inserted in an attempt to blend in with human society." - Paladin Danse
1
He's not saying it has feelings. Sentience is a term used recklessly and in a pseudo-intellectual manner. When he talks about its "sentience" he means it has some kind of intelligence. Still an amazing and disturbing story. I thought I would be much older when this type of conflict happened. I, Robot and Detroit: Become Human now seem pretty plausible in 2035.
1