Thursday, September 05, 2013

You can't argue with a zombie

A redated post.
A paper by Jaron Lanier.

35 comments:

Ilíon said...

It's true; one *can't* argue with a zombie. Only with agents can one argue (and even then, one cannot argue with an agent who declines to remain rational).

Ilíon said...

I think (not positive, though) that I've read this once before. Regardless, thanks for the link; it was a pleasure to read (or re-read).

Ilíon said...

It seems to me this 'zagnet' author is himself pretty close to accepting zombiehood. That is, he appears to be a materialist who simply doesn't want to take materialism to its logical conclusion.

Doctor Logic said...

Victor, how do you know you have intentionality? How do you know your thoughts are about anything?

Anonymous said...

Hey, ask him how he knows he has thoughts. Or how he can be sure he exists. Doubt everything, including that you can doubt. Reason demands it!

Doctor Logic said...

It's a serious question. What if you had a thought that wasn't about anything?

I imagine that such a thought would seem nonsensical or confused, upon later reflection. I'm asking how you tell the good thoughts from the nonsensical/confused ones. How do you know a good thought is about a thing and a bad thought isn't? What is it about your good thoughts that makes them about things?

I suppose you could say that any proposition containing words automatically is about things because the words themselves are about things, even if the proposition as a whole is nonsense. But we can bypass that little problem by asking what makes a word about a thing.

Mooquery.

What's that about? You don't know the word (I hope, cos I just made it up), but I have a meaning in mind. You can come to know the meaning of this word by asking me questions or watching my usage of it. At what point do you know what it means (i.e., what the word is about), and how and why do you know it?

Anonymous said...

That's a different question from the one you asked. A thought that lacked intentionality might be raw experience, like pain (see Vallicella). Though, as always, people argue both sides.

Pain isn't a bad or confused thought on those accounts either. Just a different type.

Doctor Logic said...

That's a different question from the one you asked. A thought that lacked intentionality might be raw experience, like pain (see Vallicella). Though, as always, people argue both sides.

I don't think it's a different question, but let's follow your angle for a moment.

How do you know when a raw experience is about something versus not about something?

Anonymous said...

The proponents of said view argue that pain is not "about" anything. It's simply pain, raw experience. Whereas consciously focusing is about something. I suppose they'd say "Pinch yourself, and compare the experience of the pinch to your thinking about the pinching."

On the other hand, Brentano says that intentionality is the mark of the mental, such that intentionality and phenomenal experience go hand in hand without exception. I don't "think about pain" when pinched under that view, I suppose, but the experience is itself intentional in the relevant sense.

Doctor Logic said...

Wouldn't you say that, in order for me to know what a raw experience was about, I must recognize the raw experience as something familiar?

Anonymous said...

That depends on who you're asking. For Vallicella and others, pain isn't "about" anything when experienced so. They contend it's raw phenomenal experience and divorced from intentionality.

Doctor Logic said...

My question doesn't presume that all experiences are recognizable or recognized.

Perhaps it is possible to avoid recognizing something, and instead focus on raw experience, pre-recognition, as it were.

The point is, I don't have a problem with there being experience divorced from intentionality. Indeed, I embrace the idea to a large extent. My question is about what it takes to add intentionality.

If I think *about* a rabbit, isn't it necessary that I would recognize a rabbit when (or if) I see one?

If you were somehow called upon to think about a gavagai, but you have no criteria for recognizing a gavagai when (or if) you encounter one, then wouldn't you lack intentionality with respect to a gavagai?

Victor Reppert said...

Well, I know that my thoughts are about something because my thoughts' being about something is necessary for the possibility of scientific knowledge. Intentionality has a transcendental justification.

Doctor Logic said...

Victor,

The possibility of scientific knowledge isn't necessary, but I assume that isn't quite what you wanted to use as a transcendental justification. Of course, we can get more basic than scientific knowledge, and say knowledge or reasoned justification itself might require intentionality.

I'm all for that.

However, the possibility of reasoned justification (or scientific knowledge, or whatever) requires only that some thoughts be intentional. You can probably assemble a transcendental argument for the existence of some intentional thoughts, but I don't see a transcendental argument that *all* thoughts must be intentional. It's not necessary for all of them to be intentional, is it?

So my question stands: how do you tell the difference between thoughts that have intentionality and thoughts that lack intentionality?

William said...

dlogic,

Do you understand the difference between verbal and nonverbal thoughts?

Do you think that there are nonverbal concepts?

All verbal thinking is intentional. Some nonverbal thinking is intentional. Unless you disallow nonverbal concepts.

The problem here is that in text messages we cannot express nonverbal thoughts, just refer to them, I suppose.

Doctor Logic said...

William,

I think I agree with you, but how does what you're saying impact my point?

Language is generally intentional because we generally recognize what our words refer to, i.e., it's difficult to use the word "rabbit" without being able to recognize a rabbit. I would also recognize a jumping rabbit, so "the rabbit jumped" has intentionality for me. However, this doesn't affect my argument.

If I learn to spell and pronounce the word "gavagai", but I have no way to recognize a gavagai, do I have intentionality with respect to gavagai just because I can pronounce the word? I don't think I do. I have to be able to recognize a gavagai were I to encounter one, or else the bet is off.

Doctor Logic said...

I'm getting a lot of counterpoints on this thread, but nothing directly relevant to my question. And I think the reason for this is clear. As soon as we say that recognition is required for intentionality, we have to throw out stupid examples of systems lacking intentionality (e.g., meteorite showers), and start asking difficult questions about non-trivial systems that are capable of learning to recognize stuff.

Anonymous said...

DL,

First, who is saying that "recognition is required for intentionality" in the sense you're using it here? Aristotelians argue that intentionality ("aboutness") is present in natural systems that aren't conscious. The problem is that if you recognize intentionality at all, even in a non-conscious system, you've abandoned materialism/naturalism in any meaningful sense of the words.

Now, at that point, sure - the discussion isn't over. And yes, for understanding the mind you still have to talk about "non-trivial systems" and so on. But that's where the game is one of choosing which of a variety of non-naturalist viewpoints you're going to pick up.

Of course, if that's the point you're at, well hey - welcome to the non-naturalist/non-materialist camp.

William said...

dlogic;
==
I have to be able to recognize a gavagai were I to encounter one, or else the bet is off.
==

But by default a 'gavagai' is the current set of associations for the word 'gavagai', which at minimum includes the letters g, a, v, and i. So as soon as we create the word, we have rudimentary intentionality, at least about the verbal expression itself.

I think what you want is more something like the nonverbal thought of the bat _about_ the sonar 'signature' of the moth. The trouble is that we don't know what that is like, except maybe like a footstep's echo in the hallway, I suppose...

I do know something about what the word 'gavagai' is like, though.

And yes, I think that bats have qualia, and that computer programs, including imaginary ones that use meteor showers as executable data, don't.

ingx24 said...

Lanier's paper is somewhat hard to follow at times (I'm not sure if that's because of my unfamiliarity with the terminology or if his writing is just unclear), but his general approach seems very reminiscent of Searle's: the argument that computation is not an intrinsic part of reality but rather is mind-dependent. One part I find interesting is that Lanier essentially argues that computationalism is itself a kind of dualism:

"What interests me most is the ultimate position that zombies arrive at when this argument is driven to its conclusion. After abolishing ontological distinctions based on human epistemological difficulties, zombies invent new ontologies for the benefit of computers. Inside every zombie is a weird new kind of dualist.

The new weird dualism can take a number of forms, distinguished by the choice of meaningless code words, such as "emergent" or "semantics". But the hallmark of zombie dualism is the belief in the independent, objective existence of information and computers."


This is something I've been thinking for a while: If information/computation is an intrinsic part of reality, and the mind is essentially composed of information, doesn't that make the mind distinct from its material substrate after all, and therefore make some kind of dualism true? It's not the traditional Cartesian-style dualism where minds are created ex nihilo and attached to bodies by an outside force, but it still admits the existence of a non-material reality (information) that is the essence of mind.

What confuses me about Lanier's paper, though, is his positive theory: To be completely honest, I don't understand it at all. Lanier seems to be advocating some form of identity theory, but that, if anything, seems to be a bigger threat to the existence of consciousness than computationalism is - in fact, I would argue that identity materialism ultimately collapses into eliminativism. I have to admit I'm completely baffled.

im-skeptical said...

" If information/computation is an intrinsic part of reality, and the mind is essentially composed of information, doesn't that make the mind distinct from its material substrate after all, and therefore make some kind of dualism true?"

No. Information is physical.

Crude said...

ingx24,

If information/computation is an intrinsic part of reality, and the mind is essentially composed of information, doesn't that make the mind distinct from its material substrate after all, and therefore make some kind of dualism true?

This is where things get a little dicey. If 'information' and 'computation' are intrinsic to reality, then teleology is a rock-bottom constituent of reality - out goes modern materialism, out goes naturalism.

But I'm not sure you get to 'dualism' just from this. Part of the problem is that dualism absolutely depends on a particular conception of matter - what made Descartes' mental substance what it was had a lot to do with the conception of matter Descartes was working with.

I think the best way to put it may be that the falsity of metaphysical materialism or naturalism doesn't require the demonstration of the truth of dualism: you can also get there just by sacrificing the dualist's / naturalist's concept of matter itself. It's not like Berkeley being a monist rather than a dualist encourages the materialist/naturalist.

Nor Russell, really (though nowadays I wouldn't be surprised to see neutral monism getting clung to by those parties, just as panpsychism sometimes is.)

B. Prokop said...

"information is physical"

Wha-a-a-a-a-t?????

Have fun defending that one!

mattghg said...

I wonder if Doctor Logic is still following this thread.

If I think *about* a rabbit, isn't it necessary that I would recognize a rabbit when (or if) I see one?

No. I can think about elms without being able to recognise them (at least not on my own; perhaps deference to the experts counts as recognition). I can think about pre-Socratic philosophers, and there aren't any around to recognise.

how do you tell the difference between thoughts that have intentionality and thoughts that lack intentionality?

Well, here's a stab: thoughts that have intentionality are capable of featuring in reasoning---e.g., the thought 'I am in pain' can be the premise of a deductive argument, while the raw sensation of pain can't.

im-skeptical said...

"Wha-a-a-a-a-t?????

Have fun defending that one!"


Type the "information is physical" into Google. And then ask all those physicists to defend their claim. Here's a hint: their thinking generally isn't constrained by your Thomistic pre-scientific woo.

B. Prokop said...

You have it completely backwards, Skep. And instead of typing stuff into Google, I went to actual, real-live physicists and asked them about this very issue (see: the thread on this website about quantum mechanics and information at the subatomic level, several months back), and was told that at the most fundamental level, it's not that "information is physical", but that "physical is information" - a completely different ball o' wax!

im-skeptical said...

"I went to actual, real-live physicists and asked them about this very issue"

Gee, Bob, it's too bad you don't know anything about the topic without going to your physicist buddies. I asked you to look at Google to show you at a glance that there is widespread acknowledgement of it in the scientific community.

"it's not that "information is physical", but that "physical is information" - a completely different ball o' wax!"

I don't suppose you could provide a cogent explanation of that, could you?

mattghg said...

As soon as we say that recognition is required for intentionality, we have to throw out stupid examples of systems lacking intentionality (e.g., meteorite showers), and start asking difficult questions about non-trivial systems that are capable of learning to recognize stuff.

ISTM Lanier addresses this counterargument in his footnote 16.

mattghg said...

Gee, Bob, it's too bad you don't know anything about the topic without going to your physicist buddies

What, and you do??

Ilíon said...

ing(énue)24: "If information/computation is an intrinsic part of reality, and the mind is essentially composed of information, doesn't that make the mind distinct from its material substrate after all, and therefore make some kind of dualism true?"

I-assert-you-are-irrational-if-will-not-agree-with-my-denial-of-the-very-possibility-of-rationality: "No. Information is physical."

B.Prokop: "Wha-a-a-a-a-t?????"

Actually, there isn't that much difference between them -- they are both denying the reality of *actual* minds: the salient difference is that 'I-pretend' does it directly, whereas 'I'm-a-whining-hypocrite' does it less directly.

ing(énue)24: "If information/computation is an intrinsic part of reality ..."

'Information' is proposition(s) about something else -- only actually existing minds *can* create, much less think about, information. To put it another way, 'information' exists only "within" (as we say) some actually existing mind or other.

'Computation' is counting -- only actually existing minds *can* count.

ing(énue)24: "If ... the mind is essentially composed of information ...."

There is no such thing as "the mind" -- "the mind" is a concept, it is a universal -- there are only actually existing minds.

Minds are not "essentially composed of information"; to posit such a thing is:
1) to deny that minds even exist;
2) to deny that information even exists.

And this fool has the gall to assert that I haven't answered his pretend objections to the conclusion that (western-style) atheism and materialism are but two ways of making the same set of assertions.

ing(énue)24: "... doesn't that make the mind distinct from its material substrate after all, and therefore make some kind of dualism true?"

In the context of false premises, especially in the context of false premises that entail that there is no one who *can* in the first place, who really gives a damn?

=============
I-assert-you-are-irrational-if-will-not-agree-with-my-denial-of-the-very-possibility-of-rationality: "Information is physical."

B.Prokop: "Wha-a-a-a-a-t?????

Have fun defending
that one!"

I-assert-you-are-irrational-if-will-not-agree-with-my-denial-of-the-very-possibility-of-rationality: "Type the "information is physical" into Google. And then ask all those physicists to defend their claim. Here's a hint: their thinking generally isn't constrained by your Thomistic pre-scientific woo."

What a 'Science!' fetishist fool -- who neither begins to understand, nor cares to understand, actual science.

B. Prokop said...

"Gee, Bob, it's too bad you don't know anything about the topic without going to your physicist buddies."

Yup. That's exactly what I did, 'cause the last formal, classroom training I ever got in physics was in 1969. (I think a bit's happened in the field since then.) I do enjoy a good popularization now and then, and try to keep up with the latest trends - but yes, when it comes to asking the hard questions, I turn to the authorities. Fortunately I am surrounded by them in my astronomy club and via my connections with Johns Hopkins University and the Space Telescope Institute, both in Baltimore. (And yes, they are my "buddies".)

As to "could provide a cogent explanation of that?", I wish I could remember the name of the thread where we discussed this issue at length. It's all there in detail. The conversation started by my asking the question, "Where is the information stored at the elementary particle level that informs the particle how to behave in the presence of an exterior force?" I took the question to Dr. Ron Lee, physicist and authority in quantum mechanics. He informed me that I was looking at the problem "the wrong way around", and explained that when we considered what's going on at the most elementary particle level, we're not really dealing with matter at all, but pure information. So in essence, when you break things down to the very basics, matter (as we understand it) doesn't really exist - it's all information. Thus my expression, "It's not that information is physical, but that physical is information."

im-skeptical said...

"What, and you do??"

You bet I do.

Dan Gillson said...

Fill in some gaps, Ilíon:

1. What are actual minds, and how do im-skeptical and ingx24 deny them?

2. Is all information reducible to propositions? Or can information be reduced to some other speech act? E.g., if I share a relevant bit of information with my wife, namely that I love her, am I stating a proposition or am I making a confession?

3. Stating that minds are essentially composed of information seems to take for granted the existence of minds and the existence of information, so how does stating such deny that minds exist and that information exists?

4. What is actual science?

im-skeptical said...

"So in essence, when you break things down to the very basics, matter (as we understand it) doesn't really exist - it's all information."

And yet, incredulously, you asked me to defend the assertion that information is physical. That's a commonly accepted notion in physics. I would add that 'information' has different meanings in science. There is also information theory that is more pertinent to electronics, communication and computer science (and perhaps this discussion). That's what informs the philosophy of Dretske in "Knowledge and the Flow of Information".

B. Prokop said...

Order is important, Skep. Order is important. "Information is physical" is in no way the same thing as "Physical is information".