Ramsey + Moore = God
January 6, 2014 9:59 AM

David Chalmers and Alan Hájek give a one-page argument that the Ramsey test and Moorean reasoning entail that rational subjects should accept that they have the epistemic powers of a god [pdf].

For some background on Moore's paradox, see the section entitled Moore's Problem in the SEP entry for Epistemic Paradoxes.

For some background on Ramsey's test for conditionals, see the SEP entry for The Logic of Conditionals.
posted by Jonathan Livengood (65 comments total) 18 users marked this as a favorite
 
This is great news! (Now all we need are some rational subjects.)
posted by ZenMasterThis at 10:02 AM on January 6, 2014 [11 favorites]


"If I believe p, then p" seems like something only a thoroughly self-deluded person could accept. We all must know of thousands of incidents where we "believe p" and p turns out to be false. I guess this is just pointing to a problem in how we define "believe."
posted by yoink at 10:09 AM on January 6, 2014


Bow before Hájek!
posted by cjorgensen at 10:14 AM on January 6, 2014


It seems like this relates to Gettier problems in some way.
posted by PMdixon at 10:15 AM on January 6, 2014


What's the difference between 'believe' and 'accept' in philosopherese?
posted by 0xFCAF at 10:15 AM on January 6, 2014 [1 favorite]


"Ray, if someone asks if you're a god, you say YES!"
posted by KingEdRa at 10:16 AM on January 6, 2014 [16 favorites]


There are degrees of certainty in one's beliefs, which is where this silly argument falls apart.
posted by empath at 10:16 AM on January 6, 2014 [1 favorite]


That's the joy of monotonic logics.

However, if we avoid a certain prestidigitational equivocation in the adumbration, I think all this really means is that you can't assert that p and also assert that you don't believe that p.
posted by Segundus at 10:18 AM on January 6, 2014 [2 favorites]


And when will philosophers finally stop their god-damned eternal wibbling and finally accept that p?
posted by Segundus at 10:21 AM on January 6, 2014 [9 favorites]


you can't assert that p and also assert that you don't believe that p.

Of course I can. There are certainly facts that are true but which I do not believe. Let p be any such fact.
posted by erniepan at 10:23 AM on January 6, 2014 [1 favorite]


"If I believe p, then p" seems like something only a thoroughly self-deluded person could accept. We all must know of thousands of incidents where we "believe p" and p turns out to be false. I guess this is just pointing to a problem in how we define "believe."

Well it's a scoping problem. The way 'accept' is defined, there's no separation between the "I" who is considering the implication, and the "I" who is the subject of the implication.
posted by PMdixon at 10:23 AM on January 6, 2014


Well my thoughts do literally create/alter reality just by me thinking them, so I'm prepared to accept this /bong hit
posted by crayz at 10:24 AM on January 6, 2014 [1 favorite]


you can't assert that p and also assert that you don't believe that p.

But can I assert NP?
posted by ZenMasterThis at 10:24 AM on January 6, 2014 [2 favorites]


...you can't assert that p and also assert that you don't believe that p.

Of course I can. There are certainly facts that are true but which I do not believe. Let p be any such fact.


I think it's better put that you can't assert that you believe p and also assert the negation of p.
posted by So You're Saying These Are Pants? at 10:26 AM on January 6, 2014


What's the difference between 'believe' and 'accept' in philosopherese?

From the section on Belief and Acceptance in the SEP entry on Belief:
Philosophers have sometimes drawn a distinction between acceptance and belief. Generally speaking, acceptance is held to be more under the voluntary control of the subject than belief and more directly tied to a particular practical action in a context. For example, a scientist, faced with evidence supporting a theory, evidence acknowledged not to be completely decisive, may choose to accept the theory or not to accept it.
Hope that helps.
posted by Jonathan Livengood at 10:27 AM on January 6, 2014 [2 favorites]


Follow-up upon reading more closely: both interpretations are addressed. Sorry about that.

I agree that it seems like a trick using the ambiguity of the word "accept".
posted by So You're Saying These Are Pants? at 10:28 AM on January 6, 2014


Of course I can. There are certainly facts that are true but which I do not believe. Let p be any such fact.

You can't assert, of some specific p, both that you believe it and that it isn't the case, on pain of irrationality. For instance, "It's raining, but I don't believe it's raining."

You can certainly assert that you don't believe some things which are the case.
posted by kenko at 10:41 AM on January 6, 2014


"If I believe p, then p" seems like something only a thoroughly self-deluded person could accept.

The paper is implicitly a criticism of Ramsey's way of construing indicative conditionals.
posted by kenko at 10:41 AM on January 6, 2014 [1 favorite]


You can't assert, of some specific p, both that you believe it and that it isn't the case, on pain of irrationality.

I can assert, and frequently do, that I believe some specific p, but that it may not be the case in actuality. In other words, I am aware that I can be misinformed or mistaken. As I said above, the problem here has to do with what we mean by "believe." Are we using it to describe a psychological state or are we using it to mean something like "holds to be true on the basis of sufficient and infallible evidence"?
posted by yoink at 10:47 AM on January 6, 2014


You can't assert, of some specific p, both that you believe it and that it isn't the case, on pain of irrationality. For instance, "It's raining, but I don't believe it's raining."

How about this fact: Thor is the god of thunder.

That is a fact I don't believe, because I'm not a Norse-themed neo-pagan. But if you were to tell me that Thor is the god of flowers and kittens and garden-parties I would strenuously disagree with you with 100% certainty.
posted by gauche at 10:51 AM on January 6, 2014


Of course I can. There are certainly facts that are true but which I do not believe. Let p be any such fact.

I think this misses the point. The Moorean point isn't about truth but about assertion. Many people think there is something odd -- a kind of pragmatic incoherence -- in asserting a proposition and at the same time denying that one believes that proposition. The classic example is the sentence, "It is raining, but I don't believe that it is raining." Putting it a bit more forcefully: imagine a person saying, "Look, I'm telling you right now, it is raining." And then that same person saying, "I do not believe it is raining." Wouldn't you find such a person's behavior puzzling? I know I would.

The interesting thing in the piece, to me, is combining two seemingly plausible and unrelated points -- (1) the Moorean point about belief and assertion with (2) the Ramsey test for evaluating conditionals -- to reach an obviously absurd conclusion. Something has to go, since we're not epistemic gods, but what is the right assumption to drop? (Maybe empath is right that the problem rests ultimately on an all or nothing understanding of "belief," but given that Hájek is one of the authors, I am skeptical that the puzzle is easily solved that way.)
posted by Jonathan Livengood at 10:52 AM on January 6, 2014 [1 favorite]


If two people are arguing 'if p will q?' and both are in doubt as to p, they are adding p hypothetically to their stock of knowledge and arguing on that basis about q. We can say that they are fixing their degrees of belief in q given p.
Seems reasonable to me... if you interpret it in a Bayesian way. That is, if you think of the probability of p being true as the "prior". The reductio ad absurdum Chalmers and Hájek use pivots on the sentence immediately after the above: "Let us take the first sentence the way it is often taken..." I'd argue that the problem is the determinism of this "often taken" way of viewing it. As soon as we think of it in an information-theoretic way, so that the probable veracity of p can be updated with future knowledge, the entire criticism evaporates, as far as I can tell. But perhaps that's the entire point--not to criticize the Ramsey quote, but just "the way it's often taken"?
posted by mondo dentro at 10:54 AM on January 6, 2014
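
One way to make the Bayesian reading above concrete is with a small sketch (the joint distribution and its weights are invented purely for illustration; nothing here is taken from the paper):

```python
# A toy reading of the Ramsey test in probabilistic terms: accept
# "if p then q" to the degree P(q | p), computed from a joint
# distribution, rather than treating it as an all-or-nothing conditional.

# Hypothetical joint distribution over (p is true, I believe p).
joint = {
    (True, True): 0.55,   # p true and I believe it
    (True, False): 0.15,  # p true but I miss it
    (False, True): 0.05,  # p false but I believe it anyway
    (False, False): 0.25, # p false and I don't believe it
}

def conditional_prob(given, then):
    """P(then | given), where both arguments pick out parts of a state."""
    num = sum(w for state, w in joint.items() if given(state) and then(state))
    den = sum(w for state, w in joint.items() if given(state))
    return num / den

# Ramsey-style degree of acceptance for "if p, then I believe p":
print(conditional_prob(lambda s: s[0], lambda s: s[1]))  # ~0.79, well short of 1

# And for "if I believe p, then p":
print(conditional_prob(lambda s: s[1], lambda s: s[0]))  # ~0.92, likewise short of 1
```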


Damn that's clever.

Kind of funny nobody noticed this before...
posted by Fists O'Fury at 10:56 AM on January 6, 2014


I accepted this long, long ago and that has made all the difference.
posted by chisel at 10:58 AM on January 6, 2014


I'm not a philosopher, but half way through I just thought "Well, beliefs are often not rational" and stopped.
posted by benito.strauss at 11:00 AM on January 6, 2014


I think it's pretty clear that the culprit here isn't Ramsey, it's Moore. Because you can get other arguments for infallibility and omniscience off the ground without invoking conditionals at all. For instance, taking Moorean paradoxical sentences to be irrational also commits rational agents to accepting 'Either I believe p or it is false' for every p. This could also be taken as a proof of a rational agent necessarily accepting that they are omniscient. You don't need to bring conditionals into it.
posted by painquale at 11:07 AM on January 6, 2014


You can't assert, of some specific p, both that you believe it and that it isn't the case, on pain of irrationality.

I think an example would be Moore's own: "I went to the pictures last Tuesday, but I don't believe that I did." Maybe you had a few drinks on Tuesday and don't remember going to the movies (so you don't believe you did), but you have a ticket stub in your pocket (so it must be true).

Is it possible in this strict logical system for someone to not know whether or not p, and so neither accept it nor refuse to accept it? Honest question for the logicians here.
posted by echo target at 11:09 AM on January 6, 2014


But if one accepts all instances of (1), one should accept that one is omniscient.

This is the "and then a miracle occurs" portion of the argument for me.
posted by ook at 11:09 AM on January 6, 2014


I've got a bad cold + walking pneumonia...and also have barely slept...so this may be a stupid suggestion...I'm damn foggy...

But perhaps one response is to kick things up a level and note that what this really shows is that either I am omniscient or I am irrational (because I must, for some p, accept both p and not (I believe p), or vice-versa). I know I'm not omniscient. Ergo I must accept that I'm irrational. Which, of course, I already did.

(This is on the philosopher's typical reading of 'irrational', i.e.: not perfectly rational. By 'irrational' most normal people mean, basically, nuts.)
posted by Fists O'Fury at 11:13 AM on January 6, 2014


But painquale, the sentence, "Either p is not true or I believe that p" is equivalent to "If p is true, then I believe that p." So, in your example, you are bringing conditionals into it. ;)
posted by Jonathan Livengood at 11:13 AM on January 6, 2014
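
The classical equivalence at issue is easy to check mechanically (a throwaway sketch; "I believe p" is treated as an unanalyzed atomic proposition B, and the conditional is read materially, which is exactly the reading painquale questions below):

```python
from itertools import product

# "Either p is not true or I believe p" vs. "if p, then I believe p",
# with the conditional read materially and "I believe p" (B) treated
# as an unanalyzed atomic proposition.
for p, B in product([True, False], repeat=2):
    disjunction = (not p) or B
    material_conditional = B if p else True   # "if p then B", materially
    assert disjunction == material_conditional
print("Equivalent on all four assignments (classically, at least).")
```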


Get better FO'F. Had that shit last October. No fun.

...what this really shows is that either I am omniscient or I am irrational...

I was suggesting something sort of similar above. My idea is that it's really showing that trying to apply deterministic logic to what intelligent agents do in a world with limited information leads to silliness.
posted by mondo dentro at 11:16 AM on January 6, 2014 [1 favorite]


Hmmm... again, I'm not exactly at full firepower here, so maybe I should shut up...but...

Is there something akin to the Preface Paradox lurking in there?

Perhaps it's plausible that for any specific p, I can't rationally accept both p and not (I believe p) (also: vice-versa)...but I have no doubt that, for some p or other, both p and not (I believe p)...
posted by Fists O'Fury at 11:17 AM on January 6, 2014


Thanks, MD...

Sorry, I didn't understand your comment above the first time I read it.

Brain no work.
posted by Fists O'Fury at 11:24 AM on January 6, 2014


The classic example is the sentence, "It is raining, but I don't believe that it is raining." Putting it a bit more forcefully. Imagine a person saying, "Look, I'm telling you right now, it is raining." And then that same person saying, "I do not believe it is raining." Wouldn't you find such a person's behavior puzzling? I know I would.

On the other hand, I might look out my window and see water pouring off the tiles above it and hear a drumming sound on the roof above me and say "I firmly believe that it is raining!" But if my neighbor then called me and said "Oh dear, it looks like your hot water heater is kaput and the overflow is pouring out onto your roof" I wouldn't feel myself to be in a baffling logical conundrum. That is, my "belief" in p does not logically entail, for me, the certainty that p must be the case.
posted by yoink at 11:31 AM on January 6, 2014
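
yoink's water-heater case reads like a straightforward piece of belief revision; here is a minimal Bayes-rule sketch of it (every number below is invented for illustration):

```python
# Two hypotheses about the water pouring off the roof.
prior = {"raining": 0.95, "broken_heater": 0.05}  # before the neighbor calls

# Hypothetical likelihood of the neighbor calling with the heater story
# under each hypothesis (made-up numbers).
likelihood = {"raining": 0.01, "broken_heater": 0.90}

# Bayes' rule: posterior is proportional to prior * likelihood.
unnormalized = {h: prior[h] * likelihood[h] for h in prior}
total = sum(unnormalized.values())
posterior = {h: round(w / total, 2) for h, w in unnormalized.items()}

print(posterior)  # {'raining': 0.17, 'broken_heater': 0.83}
# The earlier firm belief that it was raining is revised without any
# logical conundrum: the belief never entailed certainty to begin with.
```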


Came for Demi Moore dancing to a Ramsey Lewis tune, leaving in disappointment.
posted by Greg_Ace at 11:37 AM on January 6, 2014 [2 favorites]


Knowledge is neither justified, true, nor belief. Discuss!
posted by thelonius at 11:37 AM on January 6, 2014 [1 favorite]


But painquale, the sentence, "Either p is not true or I believe that p" is equivalent to "If p is true, then I believe that p." So, in your example, you are bringing conditionals into it. ;)

It's not equivalent if you take Ramsey's analysis of indicative conditionals as correct, is it? 'q given p' isn't the same as '~p or q'.

But I was definitely leaning on the (traditional) equivalence between the two. My point was just that I can give a parallel argument that establishes the same thing as C&H using only Moorean concerns and the truth conditions for 'or' and 'not'. I don't need to bring Ramsey's analysis of indicative conditionals into it. That strongly suggests to me that Ramsey is not to blame here.
posted by painquale at 11:38 AM on January 6, 2014


Being an applied math geek, not a philosopher, I see the problem as having an overly-simplified model. In reality, not only is there uncertainty in p, but "if p then q", interpreted as a model, means that if p is true, then q is true with probability 1. Probabilistic language sidesteps the entire argument. For example, I could define "I believe p" as meaning "I currently assess the probability that p is true to be greater than some threshold value". Furthermore, in all but the simplest situations, I'd replace "if p then q" with P(q|p), the conditional probability that q is true, given p--which itself has uncertainty!

BTW, I think the argument they're making is very pretty, and I'm assuming it's powerful within the literature of Chalmers and Hájek's subculture. Again, I'm not a philosopher, but I'm very aware that Chalmers is a major thinker--I've even read some of his stuff! Maybe it's my non-philosopher status, but it's almost like mathematics developed to get past these sorts of logical problems. It's like Zeno's paradox--cute argument, but only until you can handle infinite sums. Then it's silly. Frequently, the appearance of silliness is a clue that the conceptual framework is inadequate for the problem at hand. Indeed, that's my reason for thinking arguments like this one are so important. In this case, my interpretation of the argument is that it's demonstrating primarily that the framework provided by standard logic is inadequate for discussing belief. Again, maybe that's Chalmers and Hájek's point?
posted by mondo dentro at 11:40 AM on January 6, 2014 [6 favorites]
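
To spell out the threshold reading just sketched (the 0.9 cutoff and all of the credences below are arbitrary stand-ins, not anything from the paper):

```python
THRESHOLD = 0.9  # arbitrary cutoff for counting as a belief

def believes(credence, threshold=THRESHOLD):
    """Read 'I believe p' as: my current credence in p exceeds a threshold."""
    return credence > threshold

# Believing p on this reading never guarantees p: a credence of 0.95
# clears the bar while still leaving a 5% chance that p is false.
credence_in_p = 0.95
print(believes(credence_in_p))   # True
print(credence_in_p == 1.0)      # False: belief is not certainty

# And "if p then q" becomes the conditional probability
# P(q | p) = P(p and q) / P(p), which can also fall short of 1.
prob_p_and_q, prob_p = 0.40, 0.50
print(prob_p_and_q / prob_p)     # 0.8: q is likely given p, not entailed by it
```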


That is, my "belief" in p does not logically entail, for me, the certainty that p must be the case.

Nor does it for most people working on these issues. Moore isn't saying that you have to be certain about what you believe, only that it is irrational to simultaneously sincerely assert a proposition and believe that it is false. (Lying is not the sort of thing that's supposed to be causing the trouble here.)

I believe that there are chocolates in my kitchen right now. I could be wrong. Maybe my wife moved them. But the possibility that I am wrong does not lead me to refuse to assert that there are chocolates in my kitchen right now. In your water heater case, you surely change your beliefs after getting new information, right? So, by the end of the story, you would not say that you believed it was raining while also asserting that it was not raining. And you would not say that the water heater was leaking while denying that it was leaking.
posted by Jonathan Livengood at 11:46 AM on January 6, 2014 [1 favorite]


It's not equivalent if you take Ramsey's analysis of indicative conditionals as correct, is it?

That's right. I was trying to make a joke but failed.
posted by Jonathan Livengood at 11:53 AM on January 6, 2014


Nor does it for most people working on these issues. Moore isn't saying that you have to be certain about what you believe, only that it is irrational to simultaneously sincerely assert a proposition and believe that it is false. (Lying is not the sort of thing that's supposed to be causing the trouble here.)

But I'm not talking about lying--nobody was "lying" in my example; I had a justified but untrue "belief." I'm talking about the problem with using "If I believe p, then p" as a step towards an argument for omniscience. I'm saying that the step itself is faulty: there is no logical inference from "I believe p" to "p must be the case." I mean, either this is a simplistic tautology: "if I believe p then I must believe p" or it's an absurd overstatement. I do not simultaneously "believe p" and "disbelieve p" but I can "believe p" and simultaneously acknowledge that p might not be the case without any logical absurdity.

Now, if we redefine "believe" to mean "have justified AND TRUE knowledge that" then by all means we can infer "p" from "I believe p" (although we have, again, reduced it to an uninteresting truism). But so what? That still wouldn't be a step towards any useful argument for personal omniscience because we'd need the omniscience in order to know which of our beliefs were "true."
posted by yoink at 11:56 AM on January 6, 2014


And I'm saying that the argument does not require the move that you are correctly rejecting.

Saying that someone believes that p is just to say that he or she thinks that p is true. The person need not be certain that p is true. Believing that p and giving credence (probability) equal to one that p are not usually thought to be the same thing.
posted by Jonathan Livengood at 12:02 PM on January 6, 2014


I can believe in a bunch of enumerated propositions p1, p2, ... where for each pi I have high confidence in pi, and I can also believe that at least one of the pi in which I believe is actually false. For example, suppose I buy a lottery ticket; then I don't expect to win. And equally, I don't expect that my neighbour will win, that his cousin will win, that Mr. Jones down the road will win... and yet I believe that someone will win. So when in the paper they refer to "all instances of (1) are acceptable", that's not to say that someone should then accept the conjunction of all instances of (1).
posted by topynate at 12:04 PM on January 6, 2014 [3 favorites]
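
The arithmetic behind topynate's point is quick to check (a sketch with made-up numbers):

```python
# Each of n propositions is believed with high individual confidence,
# yet the conjunction of all of them is quite likely to contain a falsehood.
confidence_each = 0.99
n = 200  # say, 200 lottery-ticket-style beliefs

prob_all_true = confidence_each ** n
print(round(prob_all_true, 3))  # 0.134: probably at least one belief is false

# So accepting each instance of (1) individually does not commit anyone
# to accepting the conjunction of all the instances.
```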


And I'm saying that the argument does not require the move that you are correctly rejecting.

Then where does the "infallibility" and "omniscience" come in?
posted by yoink at 12:05 PM on January 6, 2014


All I know is, if you're down with p, well then you're down with me.
posted by prize bull octorok at 12:06 PM on January 6, 2014 [4 favorites]


Someone want to ELIG?
posted by Teakettle at 12:11 PM on January 6, 2014


Then where does the "infallibility" and "omniscience" come in?

Infallibility and omniscience come in at the generalization step just criticized by topynate. On one way of looking at it, the mechanics here are very similar to the lottery and preface paradoxes.

Infallibility: Everything I believe is true. The argument purports to show that if you believe that p (i.e., have some high enough credence -- for our purposes, assume anything in the range (0.5, 1]), then you will accept that p is true. Since the argument is the same for every p, you should believe about every p that if you believe p, then p is true. In other words, you should endorse the claim that everything you believe is true.

Omniscience: Everything true is believed by me. The argument purports to show (using the Ramsey test) that if p is true, then you believe that p. Since the argument is the same for every p, you should believe about every p that if p is true, then you believe that p. In other words, you should endorse the claim that everything true is believed by you.

The arguments here don't require belief to be complete certainty. They work (as far as I can tell) if you substitute a probabilistic threshold.

But again, the point here is reductio: no one thinks we are actually infallible and omniscient. So, something has gone wrong with the argument.
posted by Jonathan Livengood at 12:20 PM on January 6, 2014 [2 favorites]
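
Put schematically, at the risk of over-formalizing a comment-thread summary (B is a belief operator, Cr a credence function, and t is just the threshold mentioned above):

```latex
% Infallibility: everything I believe is true.
\forall p \,\bigl( B(p) \rightarrow p \bigr)

% Omniscience: everything true is believed by me.
\forall p \,\bigl( p \rightarrow B(p) \bigr)

% Threshold variant: read B(p) as "credence in p exceeds t" for some t in (0.5, 1].
\forall p \,\bigl( \mathrm{Cr}(p) > t \rightarrow p \bigr), \qquad
\forall p \,\bigl( p \rightarrow \mathrm{Cr}(p) > t \bigr)
```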


I believe I'll have a ham sandwich.
posted by Obscure Reference at 12:39 PM on January 6, 2014


Jesus. I didn't read the footnote until just now. There's the conditional probability!
posted by mondo dentro at 12:39 PM on January 6, 2014


Teakettle: "Someone want to ELIG?"

Explain It Like I'm... Gay? Goy? a Girl?

OK, assume that there's this really hot Jewish chick, and you BELIEVE she's lesbian and Reform. It makes no sense for you to argue with someone, "I'd have no chance with her; she's obviously a straight, Observant dude", because you don't believe that, so WTF? Argue what you believe, duh.
posted by IAmBroom at 12:45 PM on January 6, 2014 [1 favorite]


Then where does the "infallibility" and "omniscience" come in?

Omniscience comes in with: if p, then I believe that p

and

Infallibility comes in with: if I believe that p, then p.
posted by Fists O'Fury at 12:47 PM on January 6, 2014


Bah, I see that Jonathan Livengood just explained this. Ignore superfluous comment above.
posted by Fists O'Fury at 12:49 PM on January 6, 2014


The SEP entry references
...my favorite joke about Rene Descartes: Descartes is sitting in a bar, having a drink. The bartender asks him if he would like another. “I think not,” he says, and disappears.
posted by XMLicious at 12:52 PM on January 6, 2014


There are p p's; there are things we p that we p.

There are p un-p's; that is to say, there are things that we p we don't p.

But there are also un-p un-p's – there are things we do not p we don't p.
posted by swift at 1:59 PM on January 6, 2014 [5 favorites]


Oops. I just p'ed myself.
posted by mondo dentro at 2:02 PM on January 6, 2014


I find it utterly preposterous that this sort of thing is supposed to describe our actual ways of cognizing the world. <:(
posted by batfish at 2:14 PM on January 6, 2014


Relativity implies that reality depends on an observer. We are limited to observing reality via experiences which allow us to believe things. Thus we are in fact incapable of assuming p; we can only assume an observation which leads us to believe p. Due to these limitations we can only compute the following versions of the statements:

(1) If [I have observed things that led me to believe] p, then I believe p.

(2) If I believe p, then [I have observed things that led me to believe] p.

These do not have divine implications.

lol idk what i'm doing
posted by East Manitoba Regional Junior Kabaddi Champion '94 at 2:19 PM on January 6, 2014


Kind of funny nobody noticed this before...

van Fraassen discusses this in his 1984 article "Belief and the Will". This is a more specific case of the idea of "calibration". Imagine that instead of statements that we "accept" and "reject", everything has a credence value (a probability between 0 and 1). For a calibrated person, 70% of the things that she believes with 70% confidence will be true (and likewise for other percentages). Can you believe you are miscalibrated without being irrational? It doesn't seem like it. If I say "40% of the things that I believe with 50% certainty are true", that seems like a contradiction, sort of like "p and I don't believe p". But on the other hand, it seems like humility and experience require that you believe you are NOT calibrated.

By the way, I don't think you can get out of the paradox by making it about a nonspecific p. Suppose that you are open to the possibility that one nonspecific proposition that you believe is not in fact true. Imagine that someone were able to present you with all the propositions in the world that you believe. You would have to assert that you believe they are not jointly true, but if asked about each individually you would say that it is true -- and if each is true, then jointly they are all true. Again, you have the same contradiction.
posted by Philosopher Dirtbike at 2:30 PM on January 6, 2014 [2 favorites]
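
The calibration idea can be made concrete with a small sketch (the credences and truth values below are fabricated, and the bucketing is arbitrary):

```python
# Calibration: among propositions assigned roughly 70% credence, roughly
# 70% should turn out true for a well-calibrated believer.
# Fabricated data for illustration.
beliefs = [
    (0.7, True), (0.7, True), (0.7, False), (0.7, True), (0.7, True),
    (0.7, False), (0.7, True), (0.7, True), (0.7, False), (0.7, True),
]

def calibration_at(level, beliefs, tolerance=0.05):
    """Fraction of propositions near the given credence level that are true."""
    relevant = [truth for credence, truth in beliefs if abs(credence - level) <= tolerance]
    return sum(relevant) / len(relevant)

print(calibration_at(0.7, beliefs))  # 0.7: this toy believer is calibrated here
# Sincerely asserting "only 40% of my 50%-credence beliefs are true" would be
# the probabilistic analogue of a Moorean sentence.
```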


Isn't this just a problem with belief, though? It seems like a tautological statement about relativism. If you believe something is true, then it is true for you.

Obviously different people have differing beliefs, so unless truth really is relative, then the problem here is with having beliefs, rather than something like the scientific idea that one must only have theoretical models of reality which may or may not reflect any underlying truth.
posted by empath at 3:53 PM on January 6, 2014


Relativity implies that reality depends on an observer

It does no such thing.
posted by thelonius at 4:55 PM on January 6, 2014


Seems like a misuse of implicit quotations marks to me.

(1) should be written: If "p" then "I believe p" (where the implicit speaker is the same for both statements*). But If "p" then "I believe p" ≠ If p then I believe p, which is clearly ludicrous (as is If p then "I believe p"). The discussion following (1) in the pdf is actually a discussion of If "p" then "I believe p", not If p then I believe p. The trick in the final paragraph is sliding from the argument for the truth of the former to claiming to have shown the truth of the latter.

*That is, the phrase If p then I believe p as discussed in the middle portion of the essay is more explicitly written as If "p" then "I believe p", and more explicitly yet, written as If I say/assert "p" then I would say/assert "I believe p". It's paradoxical to believe p and not assert "I believe p", but it's not paradoxical to believe there exist p which are true and yet which I do not believe. However, it is true that I can never catch an example in the wild: any time I do land on a p I believe, boom, I also would assert "I believe p". But that's not a paradox, just a truism. It's kind of like those computer eyetrackers where you only see squares because they update the triangles on the screen every time your fovea lands on one; you know there are triangles out there, but you can only see squares. But you'd never conclude that because you could never say "I see a triangle," therefore there are no triangles.
posted by chortly at 5:05 PM on January 6, 2014 [2 favorites]


> Relativity implies that reality depends on an observer

It does no such thing.

I believe he meant "realty": Relativity implies that realty depends on an observer. As in, if you're going to buy a house, if you accelerate it to near the speed of light it has fewer square feet from the perspective of someone in the Earth's rest frame due to length contraction^, so you can negotiate the price down.
posted by XMLicious at 5:09 PM on January 6, 2014 [1 favorite]


Yeah, but your mortgage will last freakin' forever.
posted by mondo dentro at 5:16 PM on January 6, 2014 [2 favorites]



> "Relativity implies that reality depends on an observer"

"It does no such thing."

It does if you believe it does.
posted by Golden Eternity at 10:16 PM on January 6, 2014


>>> Relativity implies that reality depends on an observer

>> It does no such thing.

> It does if you believe it does.


Paging Dr. Smullyan...
posted by Philosopher Dirtbike at 4:42 AM on January 7, 2014

