This is a response by Carrier admirer Ben Schuldt to an early response of mine to Carrier. He uses the fallacy of composition charge against my argument.
Physicalist analyses of the mental start by defining the physical by excluding the mental, but then combinations of the physical are supposed to be mental. Yet, when the physical descriptions are complete, it looks as if the marks of the mental have disappeared and have been replaced by something that doesn't look mental at all. I call this changing the subject, but Schuldt thinks that I am question-beggingly insisting on a "magical" analysis of mind. I say I am insisting on a mentalistic analysis of mind. The mental is what it is, and is not something else.
It doesn't seem that all part-to-whole inferences commit the fallacy of composition. For example, if every part of the shed in the back yard is made of wood, then the shed is made of wood, isn't it? (Even if it doesn't weigh the same as a duck).
27 comments:
Victor,
Isn't this the issue: once you admit that some part-to-whole inferences are fallacious, you have to wonder what criteria distinguish the reliable part-to-whole inferences from the unreliable ones. Until then, it's hard to see what the implications of materialism/physicalism are, and hard to see why we should agree with your assessment of what those implications are.
Does the AFR require people to have minds? It seems to me that all it requires is some method of getting around the problem of random chance creating rationality. That could be handled by physical persons created by God: though God's existence would prove materialism wrong, it is compatible with humans who don't have minds but are part of a process intended to produce rationality. I haven't read your book, Victor, so maybe you address this, but it seems a lot of criticisms of the AFR revolve around attacking the concept of minds in people.
Clayton is right. After all, the materialist wants to say that consciousness is analogous to digestion: we don't find digestion in physics, but that doesn't mean it isn't real! Why would we expect the materialist to posit psychons at the basic level? We would expect them to do the exact opposite!
I admittedly take my dualism as basically axiomatic. Experiences just seem different from brain states. Show me the materialist theory that can disabuse me of this intuition, that can explain how this neuronal electrical storm is conscious, and I will become a vulgar materialist wrt consciousness.
That is the basic intuition that divides the dualists from the vulgar materialists. Do they seem different to you or not? If they do, then that becomes the default position and I want evidence to overcome that default. More than mere "correlations" between brain states and conscious states. Cancer is correlated with prostitution, but that doesn't mean cancer is identical to prostitution.
Clayton, do you think I am being unreasonable (subtract out my overstatement at the end)?
I am a property dualist basically because of an intuition that brain states (no matter how complex the bells and whistles become) are simply different from experiences.
Most naturalists, except the most extreme eliminativists, agree that there is clearly a conceptual difference between our concepts of the physical and our concepts of the mental. Thus, the default among materialists is conceptual dualism.
Everything else is metaphysics.
William: yes, but metaphysics is what we are talking about. If it was just semantics (water/h20) I'd be a materialist about qualia.
I think it's very easy to see what the implications of materialism/physicalism are: they have no response other than the eliminativist strategy, and pleading that if you just give them more time and maybe if you squint...
Sometimes you have to admit something new into your fundamental ontology. Two quarks won't do, and if six quarks does, then go with six quarks. With the mental, we have great reasons to suspect that the material as we conceive it won't do the trick, so the reasonable thing to do is either expand things, or admit they may need to be expanded. Maybe psychons are needed after all. Maybe panpsychism is the right view. Or maybe some other form of dualism.
Anon yes, but materialists rightly want an argument, and ultimately there is an intuition clash here. Not very sexy of a position, not much to fight about, I know. :O
parbouj:
"I am a property dualist basically because of an intuition that brain states (no matter how complex the bells and whistles become) are simply different from experiences."
Do you think this intuition could be a result of the fact that we only experience experiences, not brain states? How can you tell that brain states are different from experiences? I am unable to.
"Anon yes, but materialists rightly want an argument, and ultimately there is an intuition clash here. Not very sexy of a position, not much to fight about, I know."
I think it's a clash of intuitions the way "if you have nothing but solid black cubes, you can't arrange them to make a rainbow" is an intuition.
Anon asked:
"Do you think this intuition could be a result of the fact that we only experience experiences, not brain states?"
I don't understand the question. I experience sunsets, brain states, etc.; experiencing an experience (a 'meta' experience?) is a strange idea to me.
" How can you tell that brain states are different from experiences? I am unable to"
They just seem different. A lightning storm is not conscious. Why should I think the lightning storm in my brain is any different?
As I said, this is just an intuition, but I can't pretend it isn't there. It seems (again I don't have a strong argument) that no matter how you arrange the atoms, move them around, put in some currents or voltages, none of that is conscious, and no combination of such properties is going to be conscious.
But the pure materialist will say, 'OK, why do you believe a bunch of moving atoms can digest, but cannot be conscious?' and I will just admit it is ultimately axiomatic for me. I have seen no really conclusive arguments for my position (many good massagers of my intuition though, such as Mary).
"But the pure materialist will say, 'OK, why do you believe a bunch of moving atoms can digest, but cannot be conscious?' and I will just admit it is ultimately axiomatic for me."
First, digestion is entirely a third person property, and involves no first person properties.
Second, what happens to "digestion" on pure materialism is that we label this or that motion of atoms as "digestion". So this or that motion is digestion in virtue of our minds classifying it to be so. Which is precisely why using that trick on our minds becomes such a hassle.
Anon I don't think it is a matter of convention what digestion is. The paramecium digests even if we don't know it.
And, "First, digestion is entirely a third person property, and involves no first person properties."
But the question is whether "first-person" properties can be neural too (I don't like to put it that way, as it is using a grammar category to make an ontological point which is dicey). Can a brain be sufficient for subjectivity?
I am familiar with pretty much every argument, I'm just saying that none of them work well without a core intuition that brains and experiences are different.
I just try to be more basic, flat-footed: here is my intuition, everyone I know has this intuition. Give me good reason to overthrow it, tell me how a brain might be conscious, might be subjective, or have "first person" properties. Unless you can, I am not going to ignore this basic observation that they just seem different.
OK Peresezo let's hear an argument you think is worthy. Bring it, mr talk. Enough name-dropping. Let's see you reason with me. I bet you will not, because you cannot keep up. You throw out put-downs in lieu of arguments because your brain is only four years old, a cognitive abortion left for dead on the side of the road many years ago.
Poor thing. Still trying to pass that communications class at Northern Essex Community College? lmao
"It doesn't seem that all part-to-whole inferences commit the fallacy of composition. For example, if every part of the shed in the back yard is made of wood, then the shed is made of wood, isn't it? (Even if it doesn't weigh the same as a duck)."
There is more to the shed than its material parts; there is also the idea of 'shed' and the planning and labor that went into making that particular shed.
Thanks peresozo for making my point.
You. Got. Nothin'.
I find that I can take any neuroscience finding about the brain and place it comfortably into either a monist or a dualist framework, with equal ability to say what topics may be next to research, from an empirical standpoint.
By default we are all conceptual dualists about the mind: we sometimes use the concept of brain and concept of mind as if they were the same, but more often we do not.
The materialist thus has a conceptual gap, which, because of their devotion to the old 20th-century concept of physical monism, they fill by hand-waving at the brain's complexity.
The substance dualist has an easy time with fitting mental concepts into her world picture, but is completely reduced to mere hand waving where mind's causality must be fit into the physical.
It all seems uncomfortably like two bad (entirely un-helpful) kinds of metaphysics to me. Perhaps a vague kind of neutral monism, where almost no details are specified, would avoid fruitless speculation?
"... into her world picture ..."
Her? Her?!
Oh, I get it, you're one of those emasculated academic types who isn't man enough to speak/write in proper non-Marxist English.
Conventions change, Ilion.
That too is entirely un-helpful.
William,
"gender inclusive language" isn't a mere linguistic convention; it's politics.
And intellectual dishonesty, which is what you are presently displaying, is most unhelpful.
Look at it this way --
You have willingly chosen to surrender to the Marxist-led "convention" to use "she" when correct English grammar calls for "he" -- though, I'd bet you'd never refer to a generic murderer as "she" -- and, in response, I have adopted the "convention" of mocking your willful behavior.
Ilion: it is politics, and you have made it quite clear where you stand with respect to the question of the equality of women. That you would harp on such an irrelevant little quibble is classic Iliotroy!
William makes a good point.
Hi Victor,
"Physicalist analyses of mental start by defining the physical by excluding the mental"
See, I don't know where you get that. Can you quote me? Clearly we start with mental events and seek to understand them. They might be magic or they might be mechanical. But when we do so we get accused (regardless of what we say or actually do) of throwing them out. It seems to me you've just contrived that out of nowhere to suit your own intuitions.
Apparently most dualists seem to think that merely acknowledging mental events means you have to refute dualism first, as opposed to not knowing one way or the other and then weighing the arguments for and against from a relatively neutral standpoint. That's why Darek Barefoot attacked bias (and why I had to address it similarly): he intuitively seems to know that he's biased and won't be fair. He assumes everyone has to be equally unfair for some reason.
Dualists have to disregard the self-reporting of everyone who doesn't share their intuitions and those who also do not necessarily have physicalist intuitions. They may live their whole life not thinking about it and then one day philosophers (who bother to think about these things) stump them with the question and they don't know the answer even from their first person perspective. There are people in this world who can set aside their intuitions and not formulate conspiracy theories on the part of those who do not share them and there are those that simply can't despite everyone's protests. You may say, "I honestly can't do it," but you really need to consider it may just be a personal failing.
"It doesn't seem that all part-to-whole inferences commit the fallacy of composition."
Duh, which is why I say the primary AfR incredulity is *indistinguishable* from a fallacy of composition. It could turn out to be right in the end.
Obviously one can say, "I put 5 good people on the soccer team, so the whole team will be good" and that can turn out to be perfectly correct. On the other hand, 5 good soccer players may be individually good yet happen to have their own way of doing things that clash with each other and thus make a bad team. It can go either way and so the inference needs to be understood as non-absolute.
The philosophical point is that rationality arising from non-rationality is not inherently impossible on the face of it, and the dualist objection, if it is to be a knockdown argument against naturalism, is a fallacy of composition. The dualists may still have a correct inference in the end. Otherwise, that would be the "fallacy fallacy."
And the reason I can leave the argument here is because the intuitions naturalists seem to be up against in their critics (and hence 95% of the debate) only amount to an inference which could easily be a fallacy of composition. Hence the door *is* open for all the physicalist evidence that dualists overtly ignore. Insert the massive amounts of mutually intersecting evidences that mental states correspond to physical states and note the lack of evidence of anything to the contrary.
Ben
Since there were no links to the earlier blog posts and comments, I can only refer to what has been stated here. Nevertheless, I am familiar with Ben's position on this. I have no presumptions about Victor's.
If the argument is whether or not we can have physical explanations of the mind that preclude the mental, I would also agree, if this be Victor's argument, that the mental does possess an irreducibility. This does not mean that there isn't anything physical going on, or that the mental stands above the physical as something "magical." I would agree and argue along the lines that John Searle does on this subject: something substantial is lost when you reduce the mental to completely physical descriptions.
We could go another route, however. Look at what the Churchlands say with regard to their eliminative materialism. The things we describe as mental simply don't exist. There is no argument that "when the physical descriptions are complete, it looks as if the marks of the mental have disappeared ..." The marks are simply denied outright. The stuff of "thoughts" doesn't exist because "thoughts" don't exist.
I think there can be a common ground. It has to do with the fact that, in an ontological sense, the mental does not exist. It does not exist in the sense that the economy does not exist qua the economy. Instead, the economy is something abstracted and sustained by the social behavior of its constituent parts. Similarly, the mental qua mental cannot be so described because things like "thoughts" do not exist. That does not mean they don't play a crucial role in our existence. The question ultimately becomes not one of ontology but one of semantics. If you reduce the mental to the material, you end up with nothing mental. The question is if that is the right way to -describe- the mental (i.e., as material). This question is purposely vague. The "right" way depends. If we're being scientific about our brain processes, then there is good reason to view the mental in terms of material. On the other hand, if we're trying to understand the content of the mental qua mental, then we need a -language- (semantics) of the mental. This requires that irreducibility Searle talks about.
To return to my economics analogy, while you could talk about the economy in terms of individual people's behavior, you lose the content of talking about the economy qua economy. Instead, you're talking nonsense that really doesn't say anything about the economy. So either we face the dilemma of wanting to talk about people and their interactions or we want to abstract to these broader notions for which the language of economic serve. In this regard, the analogy parallels that of the mental and the material.
Hey Bryan,
"...something substantial is lost when you reduce the mental to completely physical descriptions."
I don't actually see it like that. The word "reduce" (for me at least) aims at the intellectual exercise of understanding what we are talking about. However, the persistent meta-patterns of personhood and consciousness are what they appear to be at their own level of things. So, there's really nothing "lost" if you make that distinction, and it seems like a justified distinction to make.
It may be quite traditional in the philosophy of mind to allow semantics to get mixed up with erroneous value judgements and bizarrely "eliminate" what clearly exists in its own way. However, people refer to all sorts of things at their own level and not always by atomic parts. I refer to atoms while ignoring their neutrons, protons, and electrons. And when I refer to those, I'm not necessarily concerned about their quarks.
So I see nothing inconsistent with referring to thoughts as "real thoughts" even though I believe they are ultimately made up of atoms.
Ben,
That's just it, though. We understand the reduction, at least in part (via The Standard Model), of atoms to their subatomic parts. Similarly, we have theories that explain the reduction of elements to atoms or human bodies as composites of molecules of various sorts.
There are, however, things lost in this reduction. For instance, (semantic) properties such as "hardness" do not apply to atoms as they do to tables. Of course, we have a translation: the hardness of a table is a certain structure of its atoms. The question is, do all properties have a translation? If not, is that due to lack of explanation (theory) or lack of ontology?
The Churchlands, if my superficial understanding of their work is correct, take the position that there is a lack of ontological translation between (semantic) entities like "thoughts" and the entities of neuroscience. Of course "thoughts" don't exist in the brain, just as "hardness" does not exist at the level of atoms. This position is more strict. It is that the translation does not exist. Thus, thoughts themselves do not exist. They are a semantic fantasy to describe stuff that does exist. The same applies to other mental 'stuff,' including intentionality and consciousness. What we believe we perceive of the semantics of the mental is flatly false. I'll ignore the case in which a lack of translation is due to lack of theory; that's just ignorance that can be overcome in principle.
I alluded to Searle. He takes a more common-ground position that I feel is apt. I would argue that his theory suggests the translation exists--mental states are brain states. This is just not a one-to-one translation, and when it comes to our understanding of the semantic relations that define mental states and mental processes, we should view them in terms of mental states and mental properties. This is what I alluded to when I used the economics analogy. Some want to think we can describe economic relations in terms of psychology, neurobiology, or even physics (if the reductions are possible, which hypothetically we can assume they are). While that may be, in principle, possible, it assumes, question-beggingly, that what we're talking about with "economic relations" is maintained. I would argue that the substance of those relations is lost. This argument requires a much broader argument regarding logic.
To give it a framework, suppose we have two choices. We can think of a function y = f(x) + g(x) + h(x). But since we know those functions, we can just substitute the "translation" of the function: y = (x) + (x*x) + (x*x*x). This is what is agreed upon. This is maintained. What is not agreed upon is beyond the technicalities of such a substitution. In logic, Frege called this "sense". The "sense" of a logical statement goes beyond the -syntax- of the language used. It has to do with our comprehension of the semantic-less terms used. Once semantics are included, there is a translation, and something very subjective is introduced (though not necessarily incomprehensible or unsystematic). This is where the irreducibility of mental states to brain states is introduced. It is not ontological. It is epistemological; it is semantic.
If we think of this in the simplistic terms of "hardness" used before, the question is whether we can describe physical properties (of, say, tables) and processes in terms of those reductions. Even if it is possible, is it appropriate? I would argue that our very scientific practice affirms the position I'm arguing. In principle, neurobiology is physics, but we don't just do "physics of the brain." There is a semantic framework that is "neurobiology." This does not preclude lower-level descriptions from being used when appropriate, or suggest that the brain is sustained by something other than physics. It just means that there is a semantic irreducibility to something so abstracted from its lower-level description. When we get to even higher abstractions, such as psychology or the economy or politics, the irreducibility becomes even more important.
Now, I agree that there is an importance in identifying these translations. As I said, this is about epistemology. It is important to be able to identify which things are actually real (ontologically) and which are not. It is important to know which things are ontologically irreducible (i.e., because they don't exist). But it also raises another issue I did not mention: sometimes it is important to have unrealistic things. I point this out because in fields like politics and economics, there are theories that describe and explain political and economic relations that we know are just flat-out unrealistic. Yet even an unrealistic model can be useful. This is not (usually) seen as a problem with the theory. There's also the problem that translations or reductions do not necessarily have to have a reduction at each "level." Again, to make a brief tangent to logic, it would be like saying we can reduce a 3rd-order logical term to a 1st-order one, and another to a 2nd-order one, but it is not necessarily the case that every term can be reduced to 1st-order. The extent of these "orders" or "levels" is unclear, and to me it is just a way of saying that some things are more or less abstracted than others. Not everything in those abstractions has a complete reduction or reduces in the same way. Some things are lost "all the way down" while still being reducible "part of the way down." What would we call this? Pseudo-(ir)reducibility?
Well, obviously we don't see wavelengths or frequencies; we see what we call "color." So it appears the brain is generating its own theater of things, like hardness, and such for equitable processing purposes. I don't know why that wouldn't be lumped in with our "reductionistic" understanding of conscious computation.