Tuesday, October 26, 2021

Do androids have a first person perspective?

Would androids have a first person perspective? My computer, running a chess program, can beat me in chess. But it doesn't seem to know the joy of victory or the agony of defeat, though we might be able to program it to behave as if it does.

29 comments:

bmiller said...

Of course androids can have first person perspective.

Their thinking parts are made of silicon derived from sand. If you arrange enough sand in just the right way and shoot electricity through it, you get a person. The same way the rest of us became persons. The question is: exactly at what point did the sand become a person?

bmiller said...

This is how your brain was made

One Brow said...

The real question is, could we program the computer so that it learns on its own to act as if it is feeling joyful or in agony, and if we do, could we say with certainty that those feelings are not genuine?

One Brow said...

bmiller,

Thank you for the video.

Is personhood a binary notion, that you either have completely or nothing at all?

Kevin said...

Is personhood a binary notion, that you either have completely or nothing at all?

Legally we do pretty much selectively impart partial personhood onto animals based on how similar they are to us from a neurological perspective. No one would get arrested for pulling the wings off a fly, but pull the ears off a dog and you're going to jail. Then again we experiment on apes. Selective indeed.

What qualities would an AI have that grant it personhood that could not equally be applied to apes, dogs, whales, ravens, pigs, bears, parrots, elephants, or even octopuses? Any animal with sufficient intelligence seems to be capable of grasping the concept of self and experiencing genuine emotions.

bmiller said...

The real question is, could we program the computer so that it learns on its own to act as if it is feeling joyful or in agony, and if we do, could we say with certainty that those feelings are not genuine?

If we told an actor that he needs to act like a lion, and he does a good enough job to get people to believe him, is he really a lion?

bmiller said...

No one would get arrested for pulling the wings off a fly, but pull the ears off a dog and you're going to jail.

I wonder why you brought up flies and torturing dogs.

One Brow said...

Kevin,
Then again we experiment on apes. Selective indeed.

We also experiment on humans.

The closer an animal is to us evolutionarily, the tighter the protocols around the experiment and the more assurance has to be given to the IRB that the animals will be treated humanely.

bmiller,
If we told an actor that he needs to act like a lion and he does a good enough job to get people to believe him. Is he really a lion?

That seems to be a confusion of ontology with the possession of specific properties.

bmiller said...

Are the actor's properties not genuine lion properties?

One Brow said...

bmiller,

Humans and lions share many properties. In what way does your question relieve the tension between ontology and the possession of properties?

bmiller said...

One Brow,

You posed this question:

The real question is, could we program the computer so that it learns on its own to act as if it is feeling joyful or in agony, and if we do, could we say with certainty that those feelings are not genuine?

I just substituted actor for computer and lion person properties for human person properties and asked the same question. If your question doesn't involve ontology then neither does mine. What does "genuine" mean in your question and why should we be curious if something exhibiting a property is genuine or not?

bmiller said...

Sorry. The last question should be:

What does "genuine" mean in your question and why should we be curious if anything exhibiting a particular property makes the property "genuine" or not?

One Brow said...

bmiller,
I just substituted actor for computer and lion person properties for human person properties and asked the same question...

Except, I didn't say the feelings in question would be human feelings, nor that the computer in question would be a human. Of course the feelings of such a putative computer would be substantially different from human feelings, just like human feelings are different from lion feelings. That does not make them unreal.

However, I'm not really sure how to define genuine here, or if such a thing could even be detectable. That's part of the reason I phrased it as a negative, saying it would be hard for us to deny the reality.

bmiller said...

Victor's OP involved first person experiences, joy and agony and computers.

I assumed he meant that joy and agony are feelings that human persons experience, and indeed what makes a human a person (from past discussions). He asked what if we could program it to mimic the feelings of a human person, and that wording seems to imply that it still couldn't have first person experiences even if it could mimic them. Therefore even if it could mimic them, it would still not be a person.

I assumed then that you were asking if a machine "on its own" started to act as if it had human personality traits, whether that would make it a genuine human person, since it had seemingly genuine human person traits.

But it seems your definition of genuine relates to something being real versus unreal, not genuine traits as opposed to mimicked traits. I'm not sure how you define real and unreal, but I assume that if a thing acts as if it has feelings, then the acting is real/genuine. Likewise if a thing was not acting, that too would be real/genuine. I don't see any tension in the question at all otherwise.

Starhopper said...

To me the (second) most interesting thing* about consciousness is that it IS unprovable - not just for androids, or animals, but even for other people. The only consciousness we can ever be 100% sure of is our own.

Now I happen to think (without proof) that consciousness is more widespread than we generally allow for. C.S. Lewis in several places asserted his own belief that humankind was the sole possessor of consciousness - that animals, for instance, had zero self awareness, and did not even know they existed. My view (again without proof) is that most, if not all, animals are self aware. And I would not be surprised to find that plants were conscious. However, if a rock (or an electron) were self aware, that would surprise me.

* The most interesting thing is that it exists at all.

One Brow said...

bmiller,

I will try to be more clear in the future. I think we agree that if we programmed a computer to imitate human feelings, then any displays of feelings are almost certainly the result of the programming.

It seems we also agree that a computer might actually possess feelings in a way that had similarities to humans, but also fundamental differences.

bmiller said...

I agree to the first sentence. Not the second.

One Brow said...

bmiller,

I seem to have misinterpreted "I'm not sure how you define real and unreal, but I assume that if a thing acts as if it has feelings, then the acting is real/genuine. Likewise if a thing was not acting, that too would be real/genuine. I don't see any tension in the question at all otherwise."

If an AI was behaving as if it had real feelings, and it was not acting, would such feelings be real or genuine?

bmiller said...

What's a real feeling and what distinguishes a real feeling from a false feeling?

bmiller said...

Also, what is a feeling?

One Brow said...

I agree those are both valid questions. Do you have an answer? If so, could you provide a definition of feeling that does not rule out machines deliberately? If not, what basis could we have for saying machines can't be capable of feelings?

bmiller said...

You asked me a question containing the terms "real", "genuine" and "feeling". There is no way I can give an answer unless I understand what you mean by those terms. I just don't understand what you're asking.

bmiller said...

It's not just that I don't know what those terms mean to you but you've asked 'if something is acting as if it's doing x, then is it genuinely doing x'. It sounds like 'if something is not really doing x then is it really doing x?'

That doesn't make sense to me. Of course, I don't know what it is that is under scrutiny of being real or unreal in the first place, so I'm at a loss.

bmiller said...

There are no coincidences.

Today is National Frankenstein Day.

Mary Shelley was only 18 when she wrote it!

One Brow said...

All I've been saying is that if a machine has not been programmed to simulate feelings, but nonetheless behaves as if it has feelings or reports having them, we have no way of ruling those out as genuine feelings.

It sounds like 'if something is not really doing x then is it really doing x?'

Sometimes you can act like you are feeling things you are not feeling; sometimes you act a certain way because you have that feeling.

bmiller said...

Still no definition of "genuine" so you're right that there's no way of ruling the condition in or out.

Artificial Intelligence is called artificial because it is not natural intelligence. I suppose then that if a machine appears to behave as if it had emotions, the appropriate term would be artificial emotions.

From Wikipedia:
Artificial intelligence (AI) is intelligence demonstrated by machines, as opposed to natural intelligence displayed by animals including humans. Leading AI textbooks define the field as the study of "intelligent agents": any system that perceives its environment and takes actions that maximize its chance of achieving its goals.[a] Some popular accounts use the term "artificial intelligence" to describe machines that mimic "cognitive" functions that humans associate with the human mind, such as "learning" and "problem solving", however this definition is rejected by major AI researchers.[b]

So it appears (to Wikipedia at least) that AI has a narrower definition than what is being assumed in this thread. It doesn't look like emotions fit into the definition at all.

bmiller said...

Here's a thought experiment.

A man behaves like a lion. Would we think he is experiencing the same experiences that a lion experiences? If not, what would we think?

A lion behaves like a man. Same question. If not, what would we think was going on?

One Brow said...

bmiller,

The popular definition is rejected as being too narrow. Mimicking human cognitive methods is merely one approach to artificial intelligence.

Different men would experience different things when behaving like a lion.

Lions don't seem to have the cognitive capacity to behave like men.

As long as we don't confuse artificial emotion with simulated emotion, I don't object to the term.

bmiller said...

True. Some science fiction writers seem to reject the opinion of the engineers working on AI. But then again, science fiction writers are not constrained by the realities engineers are constrained by. Likewise some genres of science fiction have stories about men becoming lions and lions becoming men. They can all make entertaining stories although we have to suspend our disbelief.