This is from a while back, but for ten minutes, it’s the most complete, in-depth discussion Dr. Tyson’s given on this subject.
One of the most common ways for arguments to go astray is to appeal to irrelevant reasons in support of the main claim of an argument or, for complex arguments, the resolution of a case.
Many variations of such appeals are similar to arguments from authority, in that the authority is not necessarily a person, or a direct statement made by one in or out of context, but a quality attached to an idea, a product, an alleged service, a protocol, or a treatment.
There are several such appeals to evidence which isn’t.
A few are shown below:
- The appeal to tradition/antiquity — This fallacy lies in inferring that something is true, good, healthy, or works because it has been in use for a long time, when its longevity could simply be the result of social or psychological inertia, or just plain stupidity, and not any real truth, virtue, efficacy, safety, or usefulness of the claim itself. — “But we’ve always held human sacrifices to He Who Nibbles Annoyingly at this time of year to help the crops grow…Why stop now?”
- The appeal to the new/exotic — Speciously inferring that a thing is good, useful, effective, or to be believed because of some perceived unusual or novel quality, regardless of the actual truth of the claims made for it or other relevant quality of the thing. — “This sweater costs a king’s ransom, but is well worth it, for it was knitted from the wool of Alpine mountain goats fed on imported lichens and flora harvested from a boiling subterranean Antarctic lake by trained eunuchs.”
- The appeal to sympathy — This is inferring that a claim is to be believed because those making it are deserving of our pity, sympathy, mercy, or are unjustly treated, when such an inference has no relation to the claim being offered — “Hey, this guy’s gotten short shrift in business for years, so let’s consult him on all of our important foreign policy decisions.”
- The appeal to popularity — This fallacy lies in asserting that something is to be believed because it is widely accepted, when it is easily the case that 7 billion of anybody can indeed be wrong. Indeed, everyone in the universe could believe Azathoth and the Other Gods to be real when that simply would not be the case. — This fallacy, along with appeals to celebrity, is one of the most commonly used in modern advertising. It is often coupled with the appeal to tradition in some arguments, but is pure poison no matter how it’s used.
- Appeal to unconventionality/anti-authority — A variation of the argument from authority, or perhaps a positive ad hominem, in which the claimant’s virtue is perceived to come from opposition to a tyrannical and dogmatic establishment. Indeed, it’s the claimant’s lack of expertise and allegedly revolutionary mindset that is their main claim to authority. — But those matters requiring real expertise are what they are — revolutionary sentiment and bold words do not make a science, art, or good policy.
Those using these fallacies to promote their claims must still bear the burden of proof if they wish to be taken seriously by those doing genuine research work. And if they wish to do so, the first thing to be done is to use evidence that actually bears on the issues they raise, to avoid these crimes of relevance, for science answers to a higher authority than any one researcher — reality — and while you can fool individual scientists, reality is not so easily fooled, and the truth will come out no matter how facile the argument against it.
I couldn’t have put it better! Another classic from the archives…
This post may contain NSFW language, and at least one rather graphic, not particularly kid-friendly example is provided.
You’ve been warned.
In this installment I’ll discuss the ad baculum argument, or just for the sake of annoying pedantry, because I’m evil like that, the ‘argument from the cudgel,’ or otherwise known as the appeal to force.
This is an informal argument, often fallacious in its use of an irrelevant appeal, that seeks to compel compliance, or at least feigned agreement, with a conclusion by duress or the threat of it, whether that duress be physical, psychological, or legal.
It’s a subset of the argument from consequences, and in a simple but possibly vulgar formulation basically amounts to, “Agree with me and do as you’re told, or I’ll kick your ass,” or maybe a bit less crudely, “I’m right because I’m badder and meaner than you are and I can light you up easily. So there.”
There’s also the (in)famous “Do as I say, not as I do,” with the addendum, “…or else!”
It’s fallacious when the threat, implied or expressed, has no logical relation to the claim offered, and it aims to exploit fear and a demand for submission to authority as a substitute for valid argument.
This is probably apocryphal, but there’s a classic example I’ve seen in various places on the Web (a remark more often attributed to Stalin than to Hitler): upon hearing of the then-current Pope’s displeasure with his policies, the dictator is said to have replied, “…and how many divisions does the Pope have?”
Not exactly a rhetorical question…
…and it quite nicely illustrates the specious use of this argument in making use of the idea that ‘might makes right.’
Another example of this is Pascal’s Wager, with its choice, actually a false dilemma, of either belief in God while supposedly losing nothing and a chance at winning everything, or non-belief and risking perdition if one is ‘wrong,’ whatever that’s supposed to mean, since to many religious believers, everyone else’s beliefs, or lack of them as the case may be, are wrong, even intolerable, and sometimes pure evil to boot.
Never mind the underlying self-serving motivation for belief promoted by the Wager, but that’s a subject for a future post…
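The Wager’s choice structure can be made explicit with a toy expected-utility sketch. To be clear, the `expected_utility` helper and the payoff numbers below are purely illustrative assumptions of mine, not anything from Pascal or from this post:

```python
# Toy expected-utility view of Pascal's Wager as a decision matrix.
# The payoffs are hypothetical: the Wager stacks the deck by assigning
# an infinite reward to belief, so "believe" dominates for any nonzero
# probability that God exists -- which is exactly why it's a false
# dilemma rather than a proof: it assumes the payoff matrix instead of
# arguing for it.

def expected_utility(payoffs, p_god_exists):
    """Expected utility of a choice, given (payoff if God exists,
    payoff if not) and an assumed probability that God exists."""
    if_exists, if_not = payoffs
    return p_god_exists * if_exists + (1 - p_god_exists) * if_not

# (payoff if God exists, payoff if not) -- illustrative numbers only
choices = {
    "believe": (float("inf"), -1.0),      # infinite reward vs. small cost
    "not_believe": (float("-inf"), 1.0),  # perdition vs. small gain
}

for name, payoffs in choices.items():
    print(name, expected_utility(payoffs, 0.001))
```

With infinities in the matrix, the probability chosen is irrelevant, which illustrates how the Wager forecloses the discussion before it starts.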
But an ad baculum argument is not always a fallacy, and it can have valid applications: namely, when the threat, force, or punishment invoked has a direct relation to the claims of the argument and is not merely used to overthrow a discussion by substituting intimidation or fear for actual justification of a claim. Consider the criminal penalties imposed to support the edicts of various legal systems, which hold that certain activities, including but not limited to theft, fraud, and treason, are wrong, or unethical, and should be punished by law, such as by narfling the Garthok, or by being consigned to Jabba’s Rancor pit for making bad SF movie references on this blog. Ouch.
- If you read the forbidden (and completely made-up) haiku collection ‘Reflections on Infinity,’ horrible and nasty critters (equally fictitious) from the Outer Void (as made-up as the first two) will show up and slowly eat your brain.
- Attracting the attention of such horrors would be very unpleasant, and worse than death, for madness comes as they eat your brain.
- So to best avoid this unpleasant fate, you must not read ‘Reflections on Infinity.’
Yes, that was a little over the top, but I did say this post wasn’t kid-safe.
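Over the top or not, the three-liner above has a perfectly ordinary practical-argument shape. Rendering it with labels of my own choosing (the predicate names are purely illustrative):

```latex
% P1: reading the book brings the brain-eating horrors
% P2: the horrors are worse than death, and so to be avoided
% C:  therefore, do not read the book
\begin{aligned}
P_1 &: \mathrm{Read} \rightarrow \mathrm{Eaten} \\
P_2 &: \mathrm{Avoid}(\mathrm{Eaten}) \\
C   &: \therefore\ \neg\,\mathrm{Read}
\end{aligned}
```

The form holds up as practical reasoning precisely because the threatened consequence is directly tied to the act being discouraged, which is what separates this from the fallacious ad baculum.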
Like several forms of argumentation, some fallacious and some not, its valid or invalid use depends on context: its use to promote a critical discussion is valid, while its use to squelch one is invalid…
…or sound or unsound, strong or weak, in any case, even though the logic used is probabilistic rather than certain in nature.
Most informal fallacies are not simple matters of incorrect structure having nothing to do with the content of an argument, as with syllogistic logic, but are heavily dependent on the meaning bound up in the language used, for language is inextricably bound into informal argumentation, not mere decorative filler.
A few months back, I got a comment from a visitor to this site on one of my older posts on SF psionic abilities, and thought I’d have a little fun replying to it here in more detail now that I’ve finally come up with a suitable response that doesn’t involve undue rudeness and snarkitude.
Here is the comment in its entirety:
As a healthy skeptic, why don’t you try it out yourself? Google psi beginner exercises or go to psipog.net and do it there. See if maybe you can do some of the things described. If you can, great, you’ve just proven (to yourself) that it is real. If you can’t, well then it reinforces your viewpoint. It’s a win-win situation all around.
This is, with all due respect to the commenter, a good example of the pragmatic fallacy, which basically amounts to, “If (fill in the blank) works for you, then it must be true.”
But this argument places too much unwarranted trust in personal experience, as useful as it normally is, treating it as a more reliable way of knowing than it often is.
Though we get most of the content of our thinking about reality from both direct experience (our personal sense data) and secondhand experience (reading and hearing about things from others), as someone with an occasional tendency toward self-deception, I find it ironic that my own firsthand experience has shown me how unreliable firsthand experience can be in some circumstances.
We can and often do misinterpret our experiences under surprisingly common conditions, causing us to think we are perceiving and experiencing, often in vivid detail, something that we in fact are not, at least not as it seems to us. Optical and auditory illusions, including various forms of pareidolia, are obvious examples.
And, we may even have experiences of things that do not involve any external stimuli at all.
We humans have a pronounced tendency to hallucinate more frequently than most of us feel comfortable admitting we do.
This last can easily occur under conditions of great stress, fatigue from sleep deprivation, hypnopompic or hypnagogic dreams during sleep paralysis, and even various combinations of our expectations, heightened emotional states, and preexisting beliefs.
To some, seeing is believing, but more accurately, believing is seeing, even when what is seen is not really there.
Our subjective experience can be especially misleading when it involves inferring phenomena that are not immediately apparent to us, nor directly apprehended by the physical senses, such as complex causation, as is the case with alleged psi-abilities. This is because of our propensity to see causal patterns in events, oftentimes patterns that do not truly exist, a false pattern discernment in which a number of logical fallacies and cognitive errors come into play.
So, no…I’d like to, but it’s not a good idea for me.
My propensity to occasionally hoodwink myself when drawing conclusions from untrained, uncontrolled personal observation, under questionable circumstances and from equally questionable sense data, coupled with my hobby of skepticism and my own consistent experience of how the world appears to work, informs me that just ‘trying something to see if it works for me,’ as with claims of psi-abilities, is more than likely to lead me to fool myself.
And that’s something I can do well enough without if it can be helped, thank you much.
An earlier post (Click Me Here) sparked a brief online discussion, partly in the comment threads and partly by email, with a friend of mine who is more skeptical of man-made global climate change than I am, a position that, given his differing political views, is entirely understandable.
In the original post, I expressed my own position, somewhat clumsily in retrospect I think, and also expressed a lack of interest in debating the subject, because I was confident then, and I’m confident now, that the science is largely settled, the fine details of the findings being only slightly less certain than the nuts-and-bolts of the research, with only the political arguments about it, and the will to do something, remaining a real factor.
And I don’t really care for politics…
Well, regardless of who’s right or wrong on this, he and others expressed doubts about the possibility of changing my mind. Are they justified? I think it’s dangerous to believe anything absolutely, and no, I do not believe that absolutely – the exception is my pastime of enjoying what certainty can be found in mathematics – and I can give my provisional assent to virtually any claim as long as certain rules are followed.
Ceiling cats of Lovecraft…I’m an arrogant f**k…but I have standards that must be met.
So I thought it would be fun to point out just what those rules are, to spare anyone much frustration when trying to convince me of a point, and to note that they apply to any sort of extraordinary claim. Here they are, in gruesome detail:
First, I know a thing or two about logical fallacies, and I cannot overstate the importance of minding the soundness and validity of one’s reasoning when attempting to persuade me, rather than crying logical foul and diving face-first into attacks on arguments I never made and positions I don’t hold. Valid, well-justified arguments are much more likely to win me over.
Regarding anthropogenic climate change: Pro or con, I am not going to be convinced by the same old flawed arguments I’ve seen before on climate skeptic websites, environmentalist sites, nor by the arguments of politicians and political pundits, including Glenn Beck or Al Gore. And no, I haven’t directly seen or read anything by Al Gore, nor do I care about anything he says or does. If you really want to know, I didn’t vote for him in 2000.
Try to address arguments I actually make and state outright, and do try to avoid seeing implications in my arguments that aren’t really there. If I have to point out during a discussion that a supposed implication in an argument doesn’t exist, there’s a good chance that it doesn’t. Really.
Attempting to refute points I don’t raise, like the figures on short-term or local temperature variations, the statistics on the polar bear population, or the specifics of arctic ice melting and refreezing estimates, for a few examples, is a bad idea. The reason I don’t bring up certain talking points in an argument is simple: I don’t bring up matters I don’t consider valid, relevant, or important. To sum it up, if I don’t bring it up, I’m not arguing it. Make no mistake on that.
Next, don’t use arguments appealing to partisan ideologies, especially ones I don’t happen to have in common with you, whether political, religious, or otherwise obviously slanted toward a particular view and not having anything directly to do with science. Nothing sets my baloney-detector off more easily than arguments with loaded language and obviously biased partisan rhetoric. Use your arguments like a laser beam, not a blunderbuss.
Next: Evidence! Evidence! Evidence! Extraordinary claims require extraordinary evidence, as Carl Sagan once said, and this especially applies to claims that I think are extraordinary, even if you do not. Remember, I may have different criteria for the truth-conditions of the evidence. Be prepared to reference your sources, and make doubly sure that they actually support your claims. And on that last, if they don’t support you, that’s known as a simple honest mistake with careless peeps, and known as ‘lying with footnotes’ with less honest folks. Try to reference sources that we can both agree on the validity of: As long as you don’t quote me Watts Up With That or Faux Nu’z, I won’t reference Al Gore or Treehugger.com to you. Deal?
Be aware that if you do not cite your sources, then I cannot check them and will remain indefinitely skeptical of your argument, and if you do cite your sources, I can check them for myself. Well, there you are, and them’s the rules. Good luck, and always keep your arguments well-polished.