Discussion about this post

Silas Abrahamsen

A very thoughtful post--entertaining too! But I don't think I agree with it, so I'll just note some responses.

I'm not sure that my assenting to P entails its seeming to me that P, or vice versa. For example, it might seem to me that it would be wrong to steal the organs from one person to save five, but if I'm a utilitarian, I would no longer believe that it's wrong. Surely, though, it would still seem to me to be wrong; I would simply have other, stronger convictions that entail the falsity of this.

If this is right, then it's a case where I assent to P (P = it's not wrong to steal the organs), but it doesn't seem to me that P. And likewise, it seems to me that ~P, but I don't believe ~P.

Maybe you want to say that this just couldn't be the case. After all, if your considered judgement is that P is true, then your "considered seeming" must also be that P is true. Fair enough, but on introspection I think there are things I believe even though they don't seem right to me. Even if considered seemings are real, it seems (sorry) like seemings change more slowly than beliefs. In the utilitarian case, I may eventually have the theory so thoroughly integrated into my thinking that stealing the organs no longer seems wrong (though I have a hard time imagining that), but I think that would come a long time after actually assenting to the truth of utilitarianism. So I think there will be at least some period of time during which the two don't coincide.

I still think your argument about recursive justification is interesting though. As a first pass response, I guess I would draw an analogy to the truth predicate. Let's say that the fact that this post is good is evidence that you're a good writer. Then surely "it is true that this post is good" is evidence that you're a good writer, and "it is true that it is true that..." etc.--infinite evidence glitch again! But that of course isn't right. Whatever response you give here, I would guess a similar thing could be said about intuitions.

In the Walsh-Rogan case, I guess I wouldn't say that Walsh provides a new argument when he says that the premises are intuitive. Rather, he simply reports his reasons for believing the premises. Suppose that we're debating whether you're a good writer. I give the argument:

1) This post is good

2) If this post is good, you're a good writer

3) So you're a good writer

If I then say in support of 1 "I read the post and thought that it was good", that doesn't give further reason to believe 1--it's simply a report. Likewise I can then cite literary conservatism: "If S finds X good upon reading it, that is defeasible justification for X's being good". This again doesn't give extra justification. Nonetheless, this also doesn't mean that I'm not justified in judging you a good writer. Intuitions are supposed to be internal sources of justification, and so, like my finding your post good, reporting them won't give evidence for another person, except insofar as it serves as some sort of testimony. Maybe reporting my intuitions will also help you realize that you have the same intuitions, making the reason *you* should believe the thing in question apparent--after all, we aren't aware of all the relevant reasons at all times.

You include a quote that suggests this should be a problem, but I don't think it is. I take it that an argument should only have persuasive force for you to the extent that you agree with the premises. If I present you with an argument against some proposition you believe, you will only be persuaded if the credences you have in the premises are inconsistent with the credence you have in the conclusion.

Suppose that Walsh and Rogan were--per impossibile--to figure out *all* of the implications of their views, presenting all possible arguments against each other's views, such that the other person had some credence above zero in the premises. They would consider all these arguments and adjust their credences and beliefs accordingly. Assuming that they still didn't agree, each person would end up with a set of propositions such that no new argument could rationally persuade them otherwise--Walsh would believe P and Rogan would believe ~P, and that would be the end of the story; they would just have differing brute intuitions.

While that is a sad outcome, surely it would still be irrational for each person to change their beliefs--they should definitely not believe something that *doesn't* seem right to them in this situation. I just think it's a sad fact that some people (probably most) have differing intuitions, so that they could only come to agree through irrational means. Nevertheless, that seems like the best we can do.

Willy, son of Willy

I went to a magic show. It seemed like the woman was cut in half, but I know she wasn't.

This shows it's possible to assert something that is contrary to your seemings.

When I think of intuition talk, I think of it as an invitation to the interlocutor: "Hey, this seems true to me; does it also seem true to you?" Because if it does not, then we have to debate the issue. Whatever score I get from my seeming, you also get from yours.

