Tuesday, November 08, 2005

Steven Carr on Derived Intentionality and Computers

This is a response from Steven Carr to one of my old posts; it appeared on the Infidels forum, and I only found it recently. I'm going to let some other people take a crack at this one.

This is where it is on Infidels: http://www.iidb.org/vbb/archive/index.php/t-126163.html

In http://dangerousidea.blogspot.com/2005/05/argument-from-computers.html Victor Reppert writes:
'The intentionality found in the computer is derived intentionality, not original intentionality.'

A long, long time ago people thought there was something special about organic chemicals, until the first organic chemical was synthesised by a human.

Victor's blog strikes me very much like somebody claiming that there was still something special about organic chemicals, because the synthesised chemical was created by a human, and so was derived, not original.

Victor's point is true, but surely irrelevant.

The point is that a purely material thing can manipulate very abstract non-material things (software classes, pointers, variables, etc.).

If God wanted to create us as purely material creatures, but still with intentionality, then He could do so.

Perhaps Victor would be right and our intentionality would be 'derived', rather than 'original', but that does not refute a claim that God had created us as purely material things, just the same as *we* can create purely material things that can manipulate non-material objects.

So the existence of computers refutes a claim that God cannot create purely material human beings.

If Victor wants to prove that God cannot create a purely material human being, then he needs a different argument from the ones he is using.
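For concreteness, here is a minimal C sketch (the names and numbers are invented purely for illustration) of what Carr's phrase "software classes, pointers, variables" comes to in practice: the source text is written entirely in terms of abstractions, while the machine that runs it only ever stores and flips bit patterns. Whether that amounts to a material thing "manipulating non-material things" is exactly what the comments below dispute.

    #include <stdio.h>
    #include <stdlib.h>

    /* A "class"-like abstraction: the type exists in the program text,
       not as a physical part of the machine. The names are invented
       purely for illustration. */
    struct Account {
        const char *owner;
        long balance_cents;
    };

    int main(void) {
        /* "Pointer" and "variable" are likewise notions of the programming
           language; the hardware only ever stores and switches bit patterns. */
        struct Account *acct = malloc(sizeof *acct);
        if (acct == NULL)
            return 1;

        acct->owner = "example owner";
        acct->balance_cents = 0;
        acct->balance_cents += 2500;   /* the program "manipulates" the abstraction */

        printf("%s: %ld cents\n", acct->owner, acct->balance_cents);
        free(acct);
        return 0;
    }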

16 comments:

shulamite said...

The computer is a tool. Like all tools, it does rational things because it is moved by reason, the same way that a chisel cuts the statue or a trap catches a bear.

For the sorts of people who prefer the historical account, this account of a computer is common to any new and highly developed machine; men would once have carved out a "derived intentionality" for a watch too. Why not just stick with the words we have, though, and simply notice that the computer is a tool which, like any tool, has man as its first efficient and final cause?

shulamite said...

These sorts of arguments are typical. Ten thousand opinions, and no definitions.

No solution to these problems ever arises because people never bother to give an account of matter/material, or of knowledge. Such definitions are impossible anyway, unless one has prior knowledge of matter and form, act and potency, the distinction of causes, etc. The modern academy is hell-bent against proceeding on the ground of any principles, and even very good-hearted philosophers are degraded by this bad habit.

If Steve Carr were to try to define "matter" (that from which something comes to be), "knowledge" (to be another as other), "tool" (a moved mover/instrumental cause), or "art" (right reason in making), he would realize his position is not as enlightened as he imagines it to be.

shulamite said...

I'm sorry, just one last thing; I should have thought this all out more clearly first.

The issue being discussed here is a part of the science of Natural Philosophy, the part dealing with intellective being. The difficulty is that people are trying to discuss the question apart from the rest of the science, which is no different from trying to learn calculus before learning the multiplication table. Why should we expect to solve these questions apart from the science they are found in? The real debate should be over the sorts of things that people never want to talk about any more: the division of the sciences, hylemorphism, analogy, being, motion, material/matter, prime matter, causality, etc. Figure these things out first, and then we will have something informed to say about rational being.

Steven Carr said...

Can Shulamite come up with a non-question-begging proof that an omnipotent God cannot create a material thing that has intentionality?


Why should I give precise definitions of matter, when Victor and I both agree that matter exists?

Do I need to teach Victor what matter is? Hardly.

Our disagreement is that supernaturalists say that certain supernatural things also exist in addition to the things that both naturalists and supernaturalists agree on as existing.

Mike D said...

I like the computer analogy. The naturalist would say that the computer is an example of a purely physical thing having the capability to make decisions toward goals (intentionality). The software would be instincts plus learned behavior. Steven's question is whether God could create a purely material being capable of intentionality. It seems that the answer is clearly "yes". The animal kingdom provides a fairly complete array of material creatures that appear to survive through the use of intentionality.

The mystery is not so much in the hardware, but in the software. A computer does not function as a purely physical machine, because it is acting on software created by non-machines. Victor's point about this derived intentionality is critical. The theist will see in the software an argument for a non-material component. It may not be the characteristic of intentionality, but somewhere in the complexity of human thought, consciousness, creativity, kindness... there seems to be more than biological hardware.

Victor Reppert said...

I should point out that the term "intentionality," as I am using it, means aboutness rather than purposiveness. Could a computer's states be about anything if there were no humans around to impute the about-ness to the system?

Don Jr. said...

Steven Carr, in his original post, using computers as a case in point, says that we can "create purely material things that manipulate non-material objects." Off the bat, we can say that Steven seems to have taken a dualist position. Steven does not explain how a material thing can manipulate a non-material thing (or vice versa), but he maintains that it is possible. This greatly resembles a dualist's response to a charge often brought up against dualism, namely: How can a non-material thing manipulate a material thing? That is, how can the immaterial mind affect the material body? An honest dualist will admit that she is not certain, yet will maintain, given the apparent fact that it does happen, and that it doesn't entail any sort of contradiction, that it is therefore possible (regardless of the fact that it is inexplicable). Some materialists object to this sort of position. But Steven has adopted a view virtually identical to the dualist's. Gladly, we thank Steven and welcome all the support for our position we can get.

Secondly, we can ask whether Steven is suggesting that computers actually perceive their intentionality. He seems to think so when he says that they can "manipulate non-material objects" (as if they are actually reasoning). Does Steven think that computers have minds, that computers are self-conscious? Does Steven think that computers actually perceive concepts and reason to conclusions, that they actually have intentionality? If so, it is amazing that they never make a mistake. (Maybe we should reward them for right answers and chastise them when they crash.) Steven seems to think that because we perceive the "about-ness" (which originates with us) of the reasoning, concepts, and so forth that computers merely display, computers therefore perceive it as well. (Many of my teachers have written out complex proofs of mathematical theorems on chalkboards. Does this mean the chalkboard, because it displays the reasoning of the teacher, is also a genius? Is the chalkboard now "manipulating non-material objects"?) Does Steven think a calculator is actually aware of the laws of logic? Does this mean that because Tickle-Me-Elmo can talk, he's a real person? Computers merely parrot back what we have programmed them to do. If a computer has been programmed to say, "I love you," does it now have feelings? (Hopefully that's a rhetorical question.) If the answer is no, then I am still not aware of why Steven thinks that because computers can add, they are "manipulating non-material objects." If anyone views this post as facetious, that merely reflects the absurdity of the claim that computers have intentionality. Everything I have said here is a legitimate question or issue that follows if one is to claim that computers have intentionality.

Mike D said...

Wow. I had to do a lot of reading to catch up on the concept of intentionality. Fortunately, Dee Jay made my point more correctly. A computer is just a big Chinese Room.

I think Steven is wrong about the computer: "The point is that a purely material thing can manipulate very abstract non-material things (software classes, pointers, variables etc)." The computer is just manipulating bits and bytes. It is merely the writing on the page that communicates the message from the author to the reader.

Don Jr. said...

"The computer is just manipulating bits and bytes. It is merely the writing on the page that communicates the message from the author to the reader."

Nicely and succinctly put, Mike.

Steven Carr said...

'The point is that a purely material thing can manipulate very abstract non-material things (software classes, pointers, variables etc).'

I am a computer programmer by profession, and I assure you that this is exactly what computers do.

Pick up a book on computer science.

DeeJay writes: 'The computer is just manipulating bits and bytes.'

If DeeJay thinks a byte is a material object, he has another think coming. It is a high-level concept, not a hardware concept.
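To make that concrete with a small, purely illustrative C program: even the "byte" a programmer works with is defined by the language (at least 8 bits, says C's limits.h), and the implementation maps that concept onto whatever the hardware actually provides.

    #include <stdio.h>
    #include <limits.h>

    int main(void) {
        /* CHAR_BIT is how many bits this implementation packs into one "byte";
           the C standard only guarantees at least 8. The "byte" a programmer
           manipulates is a concept the language maps onto the hardware. */
        printf("bits per byte on this machine: %d\n", CHAR_BIT);
        printf("bytes per int on this machine: %zu\n", sizeof(int));
        return 0;
    }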

Steven Carr said...

And if something without intentionality can never be given intentionality, except by God, how do people in a coma ever recover?

Rasmus Møller said...

With God's help, I guess.

Don Jr. said...

First of all, I thank you, Steven, for your response. However, it would help if you read all the posts more carefully, mine included. I didn't say, "The computer is just manipulating bits and bytes." I quoted it (from Mike). This is a blog, not a forum, so I'm aware that you don't have to respond to what I say. But if you are going to respond to what I say, then (1), as I previously said, quote me correctly, and (2) actually attempt to respond to the crucial issues I (and others) raise. You responded to absolutely nothing I said, even in my second post, which was merely meant as a compliment to Mike. And, more importantly, you gave no comment or mention to anything within my first post, which contained the entirety of my argument. I have no problem with you not responding. That's fine. But if you're going to respond, then at least attempt to answer the crucial issues. I don't know if you are, and I hope that you aren't, but please don't bypass the crucial issues just to save face.

Secondly, to throw out some argument from authority (your own self-proclaimed authority, mind you) is not a convincing (or valid) response at all. I truly am happy that you are a computer programmer (by the way, my major is electrical/network engineering, so I am also aware of computer science and share your affinity toward it), but to use that as the root of your argument ("I study computers, therefore I'm right") is (1) blatantly fallacious and (2) an insult to everyone's intelligence. I honestly am not sure if you thought anyone was supposed to truly be convinced by your statement, "I am a computer programmer by profession, and I assure you that this is exactly what computers do." But if you did, then that is an insult to everyone's intelligence. Maybe I should say, "I am a Christian, and I assure you that God exists," and then we can drop this whole exchange. However, I'd be glad to continue this dialogue with you if you care to respond to what I said in my first comment on this topic. Thanks.

Mike D said...

I still contend that the computer is not manipulating concepts, just as a book is not intelligent. Bits and bytes are no more than on/off switches. The computer programmer codes high-level concepts into computer programs. Any subsequent concepts of aboutness occur only after the program is run by another intelligent person, who then recognizes the intentionality communicated. The computer never gets it. It can be programmed to mimic understanding, but it does not understand.
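To put the same point in code, here is a small, hypothetical C sketch: one and the same bit pattern can be read as an integer, as a floating-point number (100.0 on machines that use IEEE-754 arithmetic), or as raw bytes, and nothing in the hardware decides which reading is the "right" one. That choice is supplied by the people who write and read the program.

    #include <stdio.h>
    #include <string.h>
    #include <stdint.h>

    int main(void) {
        uint32_t bits = 0x42C80000u;   /* one particular pattern of 32 on/off switches */

        float as_float;
        unsigned char as_bytes[4];
        memcpy(&as_float, &bits, sizeof as_float);   /* read the pattern as a float */
        memcpy(as_bytes, &bits, sizeof as_bytes);    /* read the pattern as raw bytes */

        printf("read as an unsigned integer: %u\n", (unsigned)bits);
        printf("read as a float            : %g\n", as_float);  /* 100.0 on IEEE-754 machines */
        printf("read as bytes              : %02x %02x %02x %02x\n",
               as_bytes[0], as_bytes[1], as_bytes[2], as_bytes[3]);  /* order depends on endianness */
        return 0;
    }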

Steven Carr said...

While Mike D. is right, it does put me in mind of people claiming that organic compounds were somehow special and, after a human had synthesised the first organic compound, perhaps defending the idea by claiming that the synthesised compounds were not living.

My post that Victor highlighted was only to suggest this analogy, to see how valid it was.

The more interesting question is whether an omnipotent God can make a conscious computer. Is there something about silicon, as opposed to carbon, which means that a computer can never be conscious, and how can the physical properties of something prevent a God from endowing it with consciousness?

Mike D said...

Steven asks:
"The more interesting question is whether an omnipotent God can make a conscious computer."

This question can be approached from several angles. It is a good question that gets to the heart of the computer analogy. One direction is the question of whether there are limits to omnipotence.

I think the better approach is to jump ahead and agree that one possible solution to the conflict between science and religion is what could be called evolutionary deism. This would be the belief that humans are biological machines that have been given consciousness by a divine being. I call it deism because it supports a view of God where he started the evolutionary process (perhaps guided it) but essentially withdrew from any subsequent involvement. This would support methodological naturalism yet answer some of the objections to philosophical naturalism.

I don't know if this is where you want to go with your analogy but this is where I imagine it heading.

I have also been trying to formulate an experiment scenario to test AI. It involves two robots on the moon. How much AI would be required to motivate the robots:
to independently explore and learn about their environment
for one to help the other if it fell into a crater
for the robots to tell each other about their day
to develop verbal communication
to develop written communication
to question where they came from
to question the purpose of their existence?
What controls would be included in the experiment to verify that these activities were not programming tricks that simulated the behavior but were actual independent decisions?