5 Comments
Silas Abrahamsen

I have a bit of a pet peeve with fallacies, so I'd be interested to hear your perspective! It seems to me that thinking in terms of fallacies (well, informal fallacies, since a charge of formal fallacy is just straightforwardly correct) often does more harm than good. Most fallacies often flag cases of perfectly fine reasoning (the exception being equivocation, which I think we can also render as a formal fallacy). By thinking in terms of fallacies, you end up pattern-matching arguments rather than actually considering the merits of the inference.

For example, there are plenty of cases where ad hominem is perfectly fine. If you have good reason to think that some person is dishonest or a slimy character, that should somewhat undercut your trust in what they say. Thinking just in terms of fallacies, you'd think "ad hominem registered: bad argument" (obviously exaggerating!).

(You do mention that someone who, say, already didn't trust capitalists would already agree. But you can seemingly say that about any argument--anyone who thinks Socrates is a man and that all men are mortal would already think that Socrates is mortal--yet the point of arguments is usually to make salient the entailments you hadn't considered.)

This is also what I often see in the wild, as well as among my peers who learn about fallacies: people end up looking for which labels they can put on an inference, when plenty of nuance might be warranted.

(My favorite example of this is William Lane Craig responding to the argument "you wouldn't have believed in God if you were born somewhere else" that it commits the genetic fallacy. This just seems like a perfect example of fallacies short-circuiting reasoning, so that you miss the point of the argument and where it might have merit.)

It just seems to me that you're better served by learning to examine individual arguments themselves (e.g. by considering how the inference would work in an analogous case, what features might undercut it, etc.). Though that is probably much vaguer and harder to teach in a course.

But I'm super interested to hear what you think, seeing as you have teaching experience (and I obviously don't). Do you disagree with what I've said, or is it just that fallacies are a necessary evil or some pedagogical stepping-stone? I assume my view wouldn't be so idealistic if I had taught undergrads for several years!

Anyway, that was quite long, but again, I'm very interested to hear what you think!

Daniel Muñoz

I think you’re really nailing an important point. If you’re just playing “spot the fallacy” with other people’s arguments, you’ll often miss the point of what they’re saying.

Ultimately the goal is to get to a point where you’re not just trying to swat away arguments or use them to whack your opponents on the head. You want to use the arguments to trace out the flow of truth and falsity. If there’s a bad argument, what would it take to make it better? Is it sound within certain limits, or given certain assumptions?

But this shift in mindset is very difficult to teach, since it’s about motivation as much as skill or craft. Students have to want to understand.

Silas Abrahamsen

Thank you for the reply! That's certainly an important part. Though I suppose I still worry that fallacies are often not the right tool for the job, even if you have the right mindset.

This might be more of a matter of degree than anything, but it just seems like they're somehow too coarse-grained. By thinking in terms of fallacies, you'll be lumping too many babies in with the bathwater, and you risk missing the point despite the right intentions. That's not to say that you can't use fallacies in your thinking and still do a good job, but it will often be despite the fallacies, not with their help.

At least in my own experience, I rarely ever think of fallacies. Perhaps that's just a fault on my part, but it seems to me like they're an unnecessary heuristic that does at least as much harm as good. After having found a fallacy to pin on an argument, I'll still have to "check my work" to find out whether the label was justified. But then categorizing the argument under a certain fallacy was unnecessary to begin with, and I often find it brings assumptions and connotations with it, rather than bringing clarity.

Still, it seems like it might just be a matter of whether it's a useful heuristic in general, and you're probably in a better position to judge that than I am :)

Daniel Muñoz

For what it's worth, I don't generally approach things I read with my Fallacy Goggles on. Except maybe when hunting for ambiguities.

Josh Dever, one of the true greats in philosophy of logic, reputedly once said that the mark of a well-trained philosopher is the ability to spot quantifier scope ambiguities. I don't think it's a *strictly* necessary condition (and idk if Dever actually said this). But I think it's very close!
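
To illustrate for anyone unfamiliar with the term (the standard textbook case, not one of Dever's): "Everyone admires someone" has two readings, depending on which quantifier takes wide scope:

∀x ∃y Admires(x, y) -- for each person, there is someone or other they admire

∃y ∀x Admires(x, y) -- there is one particular person whom everyone admires

Sliding from the first reading to the second is a classic scope fallacy.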

Silas Abrahamsen

Strictly necessary and sufficient!

(For the record I also have no problem with fallacies of ambiguity)
