Mark Wilson (Pittsburgh)
“Beware of the Blob: Cautions for Would-Be Metaphysicians”
Salvatore Florio, Ohio State University
“Interpretations of Languages with Unrestricted Quantifiers”
Abstract. A Russell-style argument put forward by Timothy Williamson shows that if the quantifiers of a first-order language are to be interpreted as absolutely unrestricted, then one faces a dilemma: one can adopt the thesis that semantic interpretations are objects only if one rejects a natural principle about what interpretations there are. According to this principle, a predicate letter in the object language should be interpretable by any formula in the metalanguage. In this paper I explore the argument and some responses to it. Moreover, I argue that giving up the principle is the preferable way out of the dilemma.
Kurt Holukoff, University of Waterloo
“Highlander: The Logic, or: The Open Texture and Vagueness of the Concept of Counterexample”
Abstract. In this paper the author argues that the concept of counterexample is a vague concept with open texture. Thus, since in some cases, but not in others, a purported counterexample will be a genuine counterexample to one and the same inference pattern, there must be at least two distinct but equally correct ways to make precise the pre-theoretic notion of validity. Therefore, logical pluralism is the right account of the consequence relation.
Charles Pence, University of Notre Dame
“Toward a New Picture of Logical Reduction”
Abstract. At the core of the positivist account of reduction lies an essentially logical operation, which has been studied extensively in the literature on philosophical logic. I argue that the implicit definition of reduction present in most formal results is substantially flawed when viewed as a component of intertheoretic reduction. Two particular examples of such formal embeddings will be demonstrated which cause substantial problems of interpretation in the context of reduction. I will then attempt to produce a new definition of this logical operation – or, at least, to provide some minimal criteria that such a definition should satisfy.
Wataru Asanuma, Florida State University
“In Defense of Platonic Realism in Mathematics”
Abstract. The opposition between Platonic realism and Constructivism marks a watershed in the philosophy of mathematics. The controversy over the Axiom of Choice is typical of this opposition (Section 1). Platonists accept the Axiom of Choice, which allows a set to be formed from members selected by infinitely many arbitrary choices, while Constructivists reject the Axiom of Choice and confine themselves to sets whose members are effectively specifiable (Section 2). The Banach-Tarski Paradox deepened skepticism about the Axiom of Choice. But the Banach-Tarski Paradox is a “paradox” only in the sense that it is a counterintuitive theorem, as distinct from a logical contradiction or a piece of fallacious reasoning. I argue that we should accept the Banach-Tarski Paradox as a Platonic truth and reject an epistemology based on mathematical intuition (Section 3). By drawing upon Gödel’s Incompleteness Theorems, I corroborate my view that mathematical truths are of a nonconstructive nature. We also have to note that Gödel’s Incompleteness Theorems show that there are limitations inherent in formal methods (Section 4). The Löwenheim-Skolem Theorem and the Skolem Paradox seem to pose a threat to Platonists. In the light of the Löwenheim-Skolem results, Quine’s and Putnam’s model-theoretic arguments take on a clear meaning. According to these arguments, the Axiom of Choice depends for its truth value upon the model in which it is placed. In my view, however, this is another limitation inherent in formal methods, not a defect of Platonism. To see this, we shall examine how mathematical models have been developed in the actual practice of mathematics (Section 5). Finally, I conclude that in mathematics, as distinct from the natural sciences, there is a close connection between essence and existence, actuality and possibility. Actual mathematical theories are parts of the maximally logically consistent theory that describes the mathematical reality (Section 6).
Mark Zelcer, CUNY Graduate Center
“On the History of Mathematical Explanation”
Abstract. The idea that there are explanations in mathematics just as there are in science is gaining currency in some philosophical circles. Part of the evidence for the existence of mathematical explanations comes from the historical claim that mathematicians have always been motivated to seek out explanatory mathematics. Paolo Mancosu offers some evidence from the history of mathematics. I challenge Mancosu’s interpretation of the history and argue that it nowhere shows a mathematical concern for mathematical explanations.
Tracy Lupher, The University of Texas at Austin (Clifton Memorial Book Prize)
“Physical Equivalence and Classical Equivalence in Algebraic Quantum Field Theory”
Abstract. Wallace (2006) argues that the appearance of unitarily inequivalent representations in quantum field theory is not a problem for the foundations of quantum field theory. The heart of his argument relies on Fell’s theorem and its deployment in the algebraic approach to quantum field theory. This argument is examined in detail and it is proven that it does not apply to the vast number of representations used in the algebraic approach. It is also proven that unitarily inequivalent representations are not another case of theoretical underdetermination. Unitarily inequivalent representations make different predictions about classical-like operators. These results are then applied to the Unruh effect.
Nathaniel Jacobs, University of California at San Diego
“Failing to Explain Time’s Arrow”
Abstract. This paper rejects Tim Maudlin’s recent defense of primitive time direction. The paper distinguishes variant and invariant properties in a universe with a primitive time direction. Invariant properties are independent of primitive time direction. Variant properties do depend on time direction, but are observationally indistinguishable from invariant properties. In conclusion, positing a primitive time direction does no explanatory work.
Michael McEwan, University of Waterloo
“Wallace’s Many-worlds Interpretation: Decoherence and Structure”
Abstract. This paper considers David Wallace’s recent many-worlds interpretation of quantum mechanics and its use of decoherence theory, which is employed in an effort to overcome some of the well-known problems facing such an approach. I identify exactly how decoherence is used to explain the world-branching structure his view posits and the role this branching structure plays in explaining the emergence of a classical macroscopic world. It is argued, however, that by using decoherence theory in this way, Wallace’s approach distances itself in certain respects from the “simple and elegant” mathematical structure of quantum mechanics, thus undermining one of the major motivations for adopting a many-worlds approach in the first place.
Emerson Doyle and Nicolas Fillion