Tuesday, June 28, 2005

Part of "Defending the Dangerous Idea"

For my presentation in England, I have been working on a paper entitled "Defending the Dangerous Idea." I've started work on the part of the paper having to do with intentionality. It may help to answer some objections that have come up on another thread:

I. The Argument from Intentionality
The first of the arguments that I presented is the Argument from Intentionality.
Physical states have physical characteristics, but how can it be a characteristic of, say, some physical state of my brain, that it is about my dogs Boots and Frisky, or about my late Uncle Stanley, or even about the number 2? Can’t we describe my brain, and its activities, without having any clue as to what my thoughts are about?
To consider this question, let us give a more detailed account of what intentionality is. Angus Menuge offers the following definition:
1) The representations are about something.
2) They characterize the thing in a certain way.
3) What they are about need not exist.
4) Where reference succeeds, the content may be false.
5) The content defines an intensional context in which the substitution of equivalents typically fails.
So, if I believe that Boots and Frisky are in the back yard, this belief has to be about those dogs; I must have some characterization of those dogs in mind that identifies them for me; my thoughts can be about them even if, unbeknownst to me, they have just died; my reference to those two dogs can succeed even if they have found their way into the house; and someone can believe that Boots and Frisky are in the back yard without believing that “the Repperts’ 13-year-old beagle” and “the Repperts’ 8-year-old mutt” are in the back yard.
It is important to draw a further distinction, a distinction between original intentionality, which is intrinsic to the person possessing the intentional state, and derived or borrowed intentionality, which is found in maps, words, or computers. Maps, for example, have the meaning that they have, not in themselves, but in relation to other things that possess original intentionality, such as human persons. There can be no question that physical systems possess derived intentionality. But if they possess derived intentionality in virtue of other things that may or may not be physical systems, this does not really solve the materialist’s problem.
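To make the idea of derived intentionality in a computer concrete, here is a minimal sketch (an editor's illustration, not part of the paper; the names dog_locations and where_is are invented for the example). The data structure "refers" to Boots and Frisky only because we stipulate that its strings stand for those dogs:

# A toy example of derived intentionality: the dictionary is just
# structured data, and its "aboutness" is imposed by the humans who
# decided that the string "Boots" stands for a particular dog.
dog_locations = {"Boots": "back yard", "Frisky": "back yard"}

def where_is(dog_name):
    # The lookup is purely mechanical; nothing here knows it is about dogs.
    return dog_locations.get(dog_name, "unknown")

print(where_is("Boots"))   # back yard
print(where_is("Rover"))   # unknown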

It seems to me that intentionality, as I understand it, requires consciousness. There are systems that behave in ways such that, in order to predict their behavior, it behooves us to act as if they were intentional systems. If I am playing chess against a computer, and I am trying to figure out what to expect it to play, then I am probably going to look for the moves it thinks are good and expect the computer to play those. I act as if the computer were conscious, even though I know that it has no more consciousness than a tin can. Similarly, we can look at the bee dances and describe them in intentional terms; the motions the bees engage in enable the other bees to go where the pollen is, but it does not seem plausible to attribute a conscious awareness of what information is being sent in the course of the bee dance. We can look at the bees as if they were consciously giving one another information, but the intentionality is as-if intentionality, not the kind of original intentionality we find in conscious agents. As Colin McGinn writes:
I doubt that the self-same kind of content possessed by a conscious perceptual experience, say, could be possessed independently of consciousness; such content seems essentially conscious, shot through with subjectivity. This is because of the Janus-faced character of conscious content: it involves presence to the subject, and hence a subjective point of view. Remove the inward-looking face and you remove something integral—what the world seems like to the subject.
If we ask what the content of a word is, that content has to be content for some conscious agent: it is how that conscious agent understands the word.
In Carrier’s critique of my book, his response to the argument from intentionality uses terms that make sense to me from the point of view of my life as a conscious subject, but I am not at all sure what to make of them when we start thinking of them as elements in the life of something that is not conscious. Consider the following:
Returning to my earlier definition of aboutness, as long as we can know that "element A of model B is hypothesized to correspond to real item C in the universe" we have intentionality, we have a thought that is about a thing.
Or
Because the verbal link that alone completely establishes aboutness--the fact of being "hypothesized"--is something that many purely mechanical computers do…
Or again
Language is a tool--it is a convention invented by humans. Reality does not tell us what a word means. We decide what aspects of reality a word will refer to. Emphasis here: we decide. We create the meaning for words however we want. The universe has nothing to do with it--except in the trivial sense that we (as computational machines) are a part of the universe.
Now simply consider the words “hypothesize” and “decide” that he uses in these passages.
I think I know what it means to decide something as a conscious agent. I am aware of choice 1 and choice 2, I deliberate about them, and then consciously choose 1 as opposed to 2, or vice versa. All of this requires that I be a conscious agent who knows what my thoughts are about. That is why I have been rather puzzled by Carrier’s explaining intentionality in terms like these; such terms mean something to me only if we know what our thoughts are about. The same thing goes for hypothesizing. I can form a hypothesis (such as that all the houses in this subdivision were built by the same builder) only if I know what the terms of the hypothesis mean; in other words, only if I already possess intentionality. That is what these terms mean to me, and unless I’m really confused, this is what those terms mean to most people.

15 comments:

Ahab said...


Victor wrote:
"It seems to me that intentionality, as I understand it, requires consciousness. There are systems that behave in ways such that, in order to predict their behavior, it behooves us to act as if they were intentional systems. If I am playing chess against a computer, and I am trying to figure out what to expect it to play, then I am probably going to look for the moves it thinks are good and expect the computer to play those. I act as if the computer were conscious, even though I know that it has no more consciousness than a tin can. Similarly, we can look at the bee dances and describe them in intentional terms; the motions the bees engage in enable the other bees to go where the pollen is, but it does not seem plausible to attribute a conscious awareness of what information is being sent in the course of the bee dance. We can look at the bees as if they were consciously giving one another information, but the intentionality is as-if intentionality, not the kind of original intentionality we find in conscious agents."

There is the imparting of real information from one bee to other bees. Those bees receiving the information are going to change their behavior based on it. That information is about something in the real world - it is one physical system (the dance) referring to another physical system (the actual 3-dimensional position of the pollen). Not only that, but a farmer may have come along in the meantime and mowed down those flowers that contain the wonderful source of pollen. So the bee dance would be referring to something that no longer exists, just as in your example of the dogs in the backyard.

Sorry, Victor, but now your definition of 'intentional' sounds bogus to me. You are simply defining it in a way that is going to support your conclusion. Why does 'intentionality' need consciousness? And what is consciousness? And if consciousness is what is required for real intentionality then there simply is no need to focus on intentionality at all. It really just boils down to the AFC - argument from consciousness.

And we haven't even moved on to those examples in the animal kingdom where consciousness can be linked to similar behavior. Or are you going to take the view that all animals except humans are mere automata or machines that only seem to be conscious? That it is only HUMAN consciousness that counts as the real thing? So maybe it should be the AFHC - the argument from human consciousness?

Victor Reppert said...

If I put a couple of pieces of fresh meat out for my dogs, they will most likely come and eat it. The odor will give them real information, and they will change their behavior based on that. Are we going to say that they are experiencing intentional states?

The intentionality that I am immediately familiar with is my own intentional states. That's the only template, the only paradigm I have. I wouldn't say that animals are not conscious, and if I found good evidence that animals could reason it would not undermine my argument, since I've never been a materialist about animals to begin with. Creatures other than myself could have intentional states, and no doubt do have them, if the evidence suggests that what it is like to be in the intentional state they are in is similar to what it is like to be in the intentional state that I am in.

Steven Carr said...

If we derive our consciousness from God, then surely we have derived intentionality, and might well be purely physical systems.

How does consciousness begin?

If a man is unconscious, is he purely physical? How does this physical system regain consciousness? What has to be added to this physical, unconscious man?

Bill Vallicella said...

"If we derive our consciousness from God, then surely we have derived intentionality, and might well be purely physical systems."

Doesn't follow. To say that we derive our consciousness from God is to say that God creates us as conscious beings possessing intrinsic or original intentionality. But to say that a map has derivative intentionality is to say that its aboutness is not a property that it has in and by itself, but involves our imputing that aboutness to it. You are equivocating on 'derive.' God creates us as original meaners, not as symbols to which he then imputes a meaning.

Victor Reppert said...

Steven: There is a sense in which, for a theist, our intentionality is derived, if the fact that we have intentionality is explained in terms of God's creating us in such a way that we possess intentionality. On the other hand, a theist does not have to maintain that, for example, the meanings of words are determined by some act of God as opposed to being determined by the way we use them. The meaning of the word "rook" is determined by how it is used by humans in the game of chess, not by divine fiat.

The statement "No conscious beings are completely physical beings" does not entail "All non-conscious beings are purely physical beings." That would be to commit a logical fallacy.

Steven Carr said...

Victor writes that it need not be true that a non-conscious person is purely physical.

What is there about an unconscious man that is not physical?

And how does an unconscious man regain consciousness?

Steven Carr said...

To follow up my question.

An unconscious man lacks intentionality. How then does a system lacking original intentionality gain original intentionality?

Steven Carr said...

Bill writes: 'To say that we derive our consciousness from God is to say that God creates us as conscious beings possessing intrinsic or original intentionality.'

If God can create conscious beings possessing original intentionality, could we also create such beings?

How did God create us as beings possessing original intentionality?

We are descended from beings that at one stage lacked original intentionality.

Was there really a being whose biological parents lacked original intentionality, but who was endowed by God with original intentionality?

Such a being would have been virtually indistinguishable from its parents, so how would its behaviour prove that it had original intentionality, while its parents' behaviours showed that they did not?

Ahab said...

Victor wrote: If I put a couple of pieces of fresh meat out for my dogs, they will most likely come and eat it. The odor will give them real information, and they will change their behavior based on that. Are we going to say that they are experiencing intentional states?

If a man sees a beautiful woman and becomes aroused from the real information that the sight of her gave him, is he experiencing an intentional state?

"The intentionality that I am immediately familiar with is my own intentional states. That's the only template, the only paradigm I have. I wouldn't say that animals are not conscious, and if I found good evidence that animals could reason it would not undermine my argument, since I've never been a materialist about animals to begin with. Creatures other than myself could have intentional states, and no doubt do have them, if the evidence suggests that what it is like to be in the intentional state they are in is similar to what it is like to be in the intentional state that I am in."

Having experience of an intentional state is not the same as explaining how you are able to have that state. That really is what neuroscientists who study the brain are trying to do. To say that such states are caused by the soul or some kind of immaterial mind is useless unless you can also provide some model that explains how it is possible for it to do this. Even if there was an intelligent designer, such information is useless for helping us to understand how the mind is able to do the things it does - like experience intentional states.

How would you explain something like blindsight with your assumption that consciousness is a unity? There we have someone who can see things without being aware that they can see. I've just ordered the book "Sight Unseen: An Exploration of Conscious and Unconscious Vision" by Goodale and Milner about a woman who suffered from such a condition. From reviews of the book I've read, the authors conclude that the visuomotor part of vision occurs at the unconscious level. They may or may not be correct in their conclusion, but it seems like they at least have the beginnings of an explanation for something like blindsight. An explanation that also helps us to understand a little better how the brain uses vision.

Victor Reppert said...

The context of my discussion of intentionality was that the terms used by Carrier to describe intentional states seem to me to be terms that presuppose both intentionality and consciousness.

Victor Reppert said...

A man aroused by a beautiful woman is in various intentional states; however, the condition of being aroused is not, on my view, an intentional state. It is the same thing with being hungry. A person who is hungry invariably has all sorts of propositional attitudes related to the hunger; however, the hunger itself is not an intentional state.

Ahab said...

Victor wrote:
The context of my discussion of intentionality was that the terms used by Carrier to describe intentional states seem to me to be terms that presuppose both intentionality and consciousness.


I'm not sure I understand you here. I think we both agree that humans are conscious and experience intentional states. I'm also assuming that we are able to share (at least to some degree) the content of each other's intentional state through language. I'm not sure how any term used to describe those states wouldn't presuppose intentionality. Doesn't the mere use of language presuppose intentionality?
I must be missing what you are saying here. Or perhaps we have simply been talking about different things? I'm puzzled.
Lewis's claim that "To talk of one bit of matter being true about another bit of matter seems to me to be nonsense" seems to me to be refuted by the air dance of the bee being about where the good source of pollen is to be found. It is certainly not intentionality as you and I experience it. The bee isn't consciously thinking about all this and hoping that the other bees have gotten the message. But that dance certainly corresponds to the location of the pollen. What else is truth but an accurate correspondence to reality?

Steven Carr said...

'Lewis's claim that "To talk of one bit of matter being true about another bit of matter seems to me to be nonsense."'

Surely Lewis's comment is true.

Only propositions can be true or false.

And a proposition is not a bit of matter.

However, propositions can be encoded in symbols, and symbols can be manipulated purely mechanically to produce new true propositions, without the manipulator being aware of what the symbols mean.

Naturalists think that there is only the manipulation of syntax going on in a human brain, and that out of this syntax comes semantics.
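To make the point about purely mechanical symbol manipulation concrete, here is a minimal sketch (an editor's illustration, not part of the original discussion; the function derive and the symbol strings are invented for the example). It applies a modus ponens-style rule to uninterpreted strings, producing new sentences without any grasp of what the symbols mean:

# Purely syntactic derivation: the program matches and copies strings.
# Nothing in it "knows" that the symbols are about Socrates or mortality.

def derive(facts, conditionals):
    """Mechanically add the consequent of any conditional whose antecedent
    is already among the facts, until nothing new can be added."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedent, consequent in conditionals:
            if antecedent in derived and consequent not in derived:
                derived.add(consequent)
                changed = True
    return derived

facts = {"SOCRATES_IS_A_MAN"}
rules = [("SOCRATES_IS_A_MAN", "SOCRATES_IS_MORTAL")]
print(sorted(derive(facts, rules)))  # ['SOCRATES_IS_A_MAN', 'SOCRATES_IS_MORTAL']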

Steven Carr said...

To follow up my question again:-

'An unconscious man lacks intentionality of any kind. How then does a system lacking original intentionality gain original intentionality?'

What is the dualistic view of how an unconscious man regains intentionality and consciousness?

Did the soul decide to restore consciousness?

Unless you posit that the soul has a consciousness and intentionality that the unconscious man is totally unaware of, I don't see how dualists can tackle this problem.

Victor Reppert said...

Some extreme forms of dualism emphasize the independence of the nonphysical mind from the brain, while others, like Hasker's Emergent Dualism, emphasize the close relation between mind and brain. A computer that has been shut off is still a computer.