
ABSTRACTS

Ofra Magidor, "Copredication and Counting: Dual Nature vs. Property Versatility"

(joint work with David Liebesman)


Abstract: Imagine a bookshelf containing three copies of War and Peace and nothing else. In one sense, it seems that there are three books on the shelf. In another sense, it seems that there is only one book on the shelf. A natural way to account for these diverging counting statements is to assume that ‘book’ can receive multiple readings: on one reading ‘book’ picks out a kind of physical object (of which there are three), and on another it picks out a kind of informational object (of which there is one).
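One way to display the diverging counts (the set-builder gloss and predicates here are illustrative assumptions, not part of the proposal itself):

\[
\#\{x : x \text{ is a physical copy of \textit{War and Peace} on the shelf}\} = 3,
\qquad
\#\{y : y \text{ is an informational work of which some copy is on the shelf}\} = 1.
\]

On the multiple-readings account, the ‘three books’ reading counts the x’s and the ‘one book’ reading counts the y’s.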


However, this natural solution comes under pressure when we consider cases of copredication. For example, there are true readings of ‘The red book is informative’, even though on the face of it, only books in the physical sense can be red, and only books in the informational sense can be informative.

In this talk, I will compare two main approaches to this problem. The first is the dual nature approach, according to which ‘book’ does not receive multiple readings but always picks out a kind of complex object consisting of both physical and informational components. The second is our own favoured approach, on which ‘book’ does receive multiple readings, and copredication is explained by the observation that properties are highly versatile: both physical books and informational books can share properties such as being informative or being on shelves.

The focus of the talk will be on the semantics of counting statements. I argue that cases that are straightforwardly accounted for by the property versatility approach pose a serious problem for dual nature approaches.

Wes Holliday, "Escaping Arrow's Theorem"

Abstract: There is an extensive literature in social choice theory studying the consequences of weakening the assumptions of Arrow's Impossibility Theorem. The general moral of this literature is that there is no escape from Arrow-style impossibility theorems unless one drastically violates the Independence of Irrelevant Alternatives (IIA) or rejects the transitivity of the social "at least as good as" relation. In this talk, based on joint work with Mikayla Kelley (Stanford), we challenge this general moral. We propose a model of comparing candidates in elections, the Advantage-Standard (AS) model, which explains why one should not normatively expect full IIA to hold, but only a weakening of IIA known as weak IIA (two profiles alike on x, y cannot have opposite strict social rankings of x and y).

We then prove a possibility theorem: there is a transitive collective choice rule satisfying weak IIA, the strong Pareto principle, the Pareto indifference principle, anonymity, and neutrality. The key to this possibility theorem is that we drop the requirement of completeness of the social "at least as good as" relation, which one should not expect to hold on the AS model and which is not required to ensure that any feasible set of candidates has a nonempty choice set of maximal elements. While Baigent (1987) and Weymark (1984) showed that weakening IIA to weak IIA alone, or dropping completeness alone, still leads to impossibility theorems, we show that weakening IIA to weak IIA and dropping completeness together leads to no such impossibility theorems for transitive collective choice rules. Finally, we not only drop completeness but also argue that it may be appropriate to treat non-singleton choice sets arising from indifference differently from non-singleton choice sets arising from noncomparability.
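As a rough gloss on the contrast drawn in the abstract (the notation, with a profile R of individual rankings and a collective choice rule F, is assumed here for illustration):

\[
\text{IIA: if } \mathbf{R}|_{\{x,y\}} = \mathbf{R}'|_{\{x,y\}}, \text{ then } F(\mathbf{R}) \text{ and } F(\mathbf{R}') \text{ rank } x, y \text{ the same way.}
\]
\[
\text{Weak IIA: if } \mathbf{R}|_{\{x,y\}} = \mathbf{R}'|_{\{x,y\}}, \text{ then not } \big(x \succ y \text{ under } F(\mathbf{R}) \text{ and } y \succ x \text{ under } F(\mathbf{R}')\big).
\]

Weak IIA thus rules out only opposite strict social rankings across profiles that agree on x and y; it leaves room, for instance, for a strict ranking in one profile and noncomparability in the other.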

Andrew Bacon, "Vagueness at Every Order"

Abstract: There are some properties, like *being bald*, for which it is a vague, and thus unknowable, matter where the boundary lies between the things that have the property and the things that do not. A number of arguments threaten to show that such properties can still be associated with determinate and knowable boundaries: not between the things that have the property and those that don't, but between the things such that it is borderline at some order whether they have it and the things for which it is not.

I argue that these arguments, if successful, turn on a contentious principle in the logic of determinacy: Brouwer's axiom, according to which every truth is determinately not determinately false. Other paradoxes that do not appear to turn on this principle often tacitly make assumptions about assertion, knowledge, and higher-order vagueness. In this talk I'll show how one can avoid sharp higher-order boundaries by rejecting these assumptions.
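Writing Δ for "determinately" (a notational choice made here, not in the abstract), the Brouwerian principle just glossed is

\[
A \rightarrow \Delta\neg\Delta\neg A,
\]

the determinacy analogue of the modal B axiom $A \rightarrow \Box\Diamond A$.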

Harvey Lederman, "Something Isn't Something"

Abstract: Recent discussions of propositional fineness of grain typically start by setting aside propositional attitudes, distinguishing questions about "representations" from those about "reality". The first half of this talk calls this increasingly common move into question. I begin with an argument in the vicinity of the so-called "problem of logical omniscience." I argue that this argument does not pose any kind of general "problem," but is simply a sound argument for a fine-grained theory of propositions. The second half of the talk develops such a fine-grained theory. The opening argument of the talk motivates a new principle governing propositional fineness of grain, which I call "Closed Structure." This principle is consistent with classical quantification theory and material beta-equivalence in higher-order logic. But this package has a surprising consequence: it entails the existence of certain ineffable properties. To avoid this consequence, I propose that, like "Pegasus", certain higher-order quantifiers do not denote anything. In a slogan, "something isn't something." I prove the consistency of a fine-grained theory that embraces this idea, together with Closed Structure and material beta-conversion. This new package does not imply the existence of ineffable properties. I close by considering the relationship between the new theory and sententialism, and by revisiting its implications for the content of propositional attitudes.

Sarah Moss, "Knowledge and Legal Proof"

Abstract: Contemporary legal scholarship on evidence and proof addresses a host of apparently disparate questions: What does it take to prove a fact beyond a reasonable doubt? Why is the reasonable doubt standard notoriously elusive, to the point that courts have sometimes deemed it impossible to define? Can the standard of proof by a preponderance of the evidence be defined in terms of probability thresholds? Why is merely statistical evidence often insufficient to meet the burden of proof?


This paper defends an account of proof that addresses each of these questions. Where existing theories take a piecemeal approach to these puzzles, my theory develops a core insight that unifies them: the thesis that legal proof requires knowledge. Although this thesis may seem radical at first, I argue that it is in fact highly intuitive; indeed, the knowledge account of legal proof does better than several competing accounts at making sense of our intuitive judgments about what legal proof requires.

Bernhard Salow, "Dogmatism and Belief Revision"

Abstract: Kripke’s dogmatism puzzle is an apparently compelling argument for the absurd conclusion that, if one knows P, one should avoid or ignore all evidence against P. I will argue that the key to solving this puzzle lies in the theory of belief revision. In particular, the solution depends on whether we accept Rational Monotonicity, the principle that if we initially believe P while leaving Q open, we should continue to believe P once we discover that Q is true. For either the principle is true, in which case the argument breaks down for one reason; or it is false, which makes space for a different solution.
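For a formal gloss: with an AGM-style belief set K and revision operator * (notation assumed here for illustration, not taken from the abstract), Rational Monotonicity as stated above comes to roughly

\[
\text{if } P \in K \text{ and } \neg Q \notin K, \text{ then } P \in K * Q,
\]

that is, learning something one had left open should not dislodge any of one's prior beliefs.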

Michael Caie, "Context Dynamics"

Abstract: In this paper, I consider how, given mutual knowledge of the information codified in a compositional semantic theory, an assertion of a sentence serves to update the shared information in a conversation. There is a standard account of how such conversational updating occurs. While this account has much to recommend it, in this paper I argue that it needs to be revised in light of certain patterns of updating that result from some natural discourses. Having argued for this, I present a new account of conversational updating that can be seen as a natural generalization of the standard account, and show how it can predict these patterns in a simple and principled manner.
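For orientation, the standard account alluded to above is often modelled Stalnaker-style (an assumption about the intended background, not spelled out in the abstract): the context set c, the set of worlds compatible with the shared information, is updated by an assertion of S to

\[
c' = c \cap \llbracket S \rrbracket^{c},
\]

where \llbracket S \rrbracket^{c} is the proposition S expresses in c.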
