I still think his argument doesn't work, as it again is based on the following:

> But (6) is absurd (self-contradictory): it amounts to saying that it is logically possible that the very criterion of logical possibility, namely LNC, be false.

As I said in the previous post, I think the LNC is sometimes false, as the violation of the law of the excluded middle in quantum mechanics suggests.

Again, I agree with his conclusion (slightly restated here) that what is true in some axiomatic system is not just an empirical generalization. I just don't think his argument works.

Also, the fact that we don't even know whether standard axiomatizations of arithmetic are consistent (by Gödel's second incompleteness theorem, such a system cannot prove its own consistency) means it is impossible to prove, from within the system, that there aren't any contradictions to be derived even in such non-"deviant" cases.
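For reference, the result I'm leaning on here is Gödel's second incompleteness theorem, which (stated informally) says:

```latex
\text{If } T \text{ is a consistent, recursively axiomatizable theory extending } \mathsf{PA},
\text{ then } T \nvdash \mathrm{Con}(T).
```

So any proof that arithmetic is consistent has to use means going beyond arithmetic itself, and is therefore only as secure as those stronger means.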

I think some axiomatic systems are more useful than others. Determining which will be most useful is an empirical thing. However, each axiomatic system does generate a set of possible inferences and impossible inferences. They are not empirical generalizations. So I agree with his conclusion, but think the metaphysical consequences are banal.
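To illustrate the point that each system generates its own set of possible and impossible inferences, here is a deliberately tiny toy model (my own illustration, nothing from Vallicella's post): treat a "system" as a set of axioms plus one rule, and compute the closure.

```python
# Toy illustration: an "axiomatic system" as axioms plus a single rule
# (a modus-ponens-like step over given implications), generating its set
# of derivable formulas by closure.

def theorems(axioms, implications, limit=10):
    """Close a set of formulas under the rule: from p and (p -> q), derive q."""
    derived = set(axioms)
    for _ in range(limit):
        new = {q for (p, q) in implications if p in derived and q not in derived}
        if not new:
            break
        derived |= new
    return derived

implications = {("A", "B"), ("B", "C")}

# Two systems sharing the same rule but different axioms license
# different inferences: deriving "A" is possible in one, impossible in the other.
assert theorems({"A"}, implications) == {"A", "B", "C"}
assert "A" not in theorems({"B"}, implications)
```

The point of the toy: which closure you get is fixed by the system, not by observation; only the choice *between* systems answers to empirical usefulness.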

A note, just so I don't paint myself as a wishy-washy postmodernist or something:

While I usually think reductio ad absurdum arguments are fine, I think that in certain contexts, like arguments about the foundations of logic and mathematics, we need to be more cautious about throwing around terms like "logically true" and assuming this means 'doesn't violate the LNC'.

More directly to his argument, I would say that, while the laws of logic could be empirical generalizations incorporated into some axiomatic system, this would not mean that, in that system, it is possible for P to be false. In some other system, P could be false; i.e., it could be logically possible.
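To make this system-relativity concrete: in Priest's paraconsistent logic LP, sentences can take a "glut" value (both true and false), and a contradiction P ∧ ¬P can then come out designated, even though the same formula is a classical falsehood. A sketch (the numeric encoding below is my own toy illustration, not from Vallicella's post or Priest's book):

```python
# Toy comparison: the "law of non-contradiction" ~(p & ~p) in classical
# two-valued logic vs. Priest's LP (three values: T, B = "both", F).

T, B, F = 1.0, 0.5, 0.0          # truth values; B is the glut value
DESIGNATED = {T, B}              # values that count as "true enough" in LP

def neg(v):                      # negation: swaps T and F, leaves B fixed
    return 1.0 - v

def conj(a, b):                  # conjunction: minimum on the order T > B > F
    return min(a, b)

def lnc(p):                      # the formula ~(p & ~p)
    return neg(conj(p, neg(p)))

# On the classical values the LNC formula is a tautology:
assert all(lnc(p) == T for p in (T, F))

# But in LP, when p takes the glut value B, the contradiction itself
# comes out designated, so p & ~p can "hold" in the LP sense:
assert conj(B, neg(B)) in DESIGNATED
```

Note the nuance: in LP the formula ¬(P ∧ ¬P) is still valid (always designated); what changes is that P ∧ ¬P can be designated too. So "P is impossible in this system" and "P is possible in that one" can both be true, which is all my reply needs.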

There are so many ways around his argument that it is hard to choose what to pick on (but again, I think his conclusion is right).

Gila Sher's book The Bounds of Logic and Kitcher's book on Mathematical Knowledge are better than I could hope to be on these topics.

Finally, I want to make clear that I am not all that sure if my arguments are sound. When I said "There are so many ways around his argument" I came off too strong: I should say that there are lots of possible ways around the argument. However, these issues are complicated, and to confidently claim that any of these possibilities really works would probably require a couple of years of study for each one.

I think, for the naturalist, there are a few ways to account for mathematical knowledge. One is the old Quinean story which I've been pushing: mathematics (and logic as a subset) is something we find pragmatically useful for modeling the world, and so such systems evolve with our understanding of the world and are sensitive to empirical facts (as in Putnam's 'Is Logic Empirical?'). I think a lot of the QM-logic types take this tack (for a recent quantum logic paper, see this link).
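To make the Putnam-style point concrete: in the lattice of quantum propositions (closed subspaces of a Hilbert space), it is the distributive law that fails. With $p$ a sharp position proposition and $q, r$ incompatible momentum propositions that jointly exhaust the possibilities:

```latex
p \wedge (q \vee r) \;=\; p \wedge \mathbf{1} \;=\; p,
\qquad\text{but}\qquad
(p \wedge q) \vee (p \wedge r) \;=\; \mathbf{0} \vee \mathbf{0} \;=\; \mathbf{0}.
```

So at least one classical law comes under empirical pressure from QM, whichever law one takes to be the culprit.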

Whether one can be a realist about mathematical knowledge in this case is a good question, and I tend to be a nonrealist about mathematical statements (I think mathematical truths are parasitic on cognitive systems and empirical truths about the world, which is different from saying that mathematical truths are empirical generalizations).

Another alternative (not mutually exclusive) is that mathematics and logic reflect basic cognitive operations, which, in a Kantian-like way, we take as basic and without which we could not make sense of the world. Perhaps it isn't possible for us to conceive of a world where the LNC fails (though Priest's recent book argues against this in interesting ways).

At any rate, while I am not convinced by Vallicella's brief argument for technical reasons, I think if he expanded it into a book (which is what the topic deserves) he might be able to make an excellent case.
