A while back, I was at what was then the game shop I went to on weekends, chatting with a friend who expressed a view I found somewhat disturbing, though not surprising from some in this country: that he found one man more credible and trustworthy for making obviously unreasonable statements, and another worthy of suspicion for sounding more reasonable, because, as he put it, “you at least know where he (the unreasonable man) stands.”
Let’s unpack this and see what’s being said…
It reflects a distrust of reason itself, on the grounds that clever argumentation is easy to use to mislead and deceive. It seems to say that an openly irrational person at least isn’t hiding his stance beneath a cloak of deceit.
This ignores the difference between fallacious reasoning used to deceive, and sound or cogent reasoning used to discover the truth, not hide it.
I’m reminded of Martin Luther’s warning that reason was deceptive, an argument that ironically, and with some logical inconsistency, used reason itself to make its point…
Logical and rhetorical fallacies are the tools of rationalization and propaganda: you cannot support a false position using good reasoning and sound premises, nor can you reliably reach a sound position using bad logic, presuming facts not in evidence, or taking facts out of context.
But it is itself fallacious to conflate bad reasoning with good and reject both out of hand. Reason is fallible, and that’s why there are tests we can use to evaluate its soundness. That’s one reason skeptics keep up on logical fallacies as well as on the techniques of good reasoning as a means of reaching reliable conclusions.
I don’t know about you, but I find the more reasonable types more trustworthy, not less. Genuine, unfeigned rationality implies a willingness to discuss disagreements; feigned rationality is one reason I find religious apologetics and ideologically motivated denial of science (on both the left and the right) so unworthy of credibility.
Certainly, it’s a good idea to turn a skeptical eye even on reason and rationality, the better to know when they’re being misused to our detriment, but all else being equal, we need more reasonable people, not fewer. The less rational types are simply making their destructive attitudes more obvious, and more dangerous. It takes a reasonable man to have a reasonable stance.
That, and at least you can discuss things with reasonable people without getting shot or stabbed.
Chris Parsons on Yahoo! News posted this article on Will Storr’s new book, ‘The Heretics: Adventures With The Enemies Of Science,’ and his post raised a few livid crimson flags, which I’ll point out below. His article starts innocently enough, with this statement:
“Will Storr is a man who deals in facts. As a journalist of more than 10 years, undeniable evidence and rational data are his bread and butter. There are groups of people, however, who deny the irrefutable; who see cold, hard facts as mistruths or simply inconvenient.”
Parsons goes on to include among these people Holocaust deniers, creationists and UFO believers. Still, given what I’ve learned about the rank and file of journalistic media over my six years as a skeptic, I may be excused a little suspicion of a possible argument from authority in Parsons’ first two sentences, which describe few of the journalists in most outlets I’ve heard or read.
Not to diss journalists as a whole, but fewer in the profession are as fact-based as is being implied here. But let’s grant, shall we, that Mr. Storr is one of the few who sticks to the facts, values rationality, and knows what he’s doing, though I’d never heard of him before now.
Parsons continues, with this:
“So why are there intelligent, seemingly rational people like this, who are capable of such unreasonable logic? The question is the subject of Storr’s new book, which explores the ‘beliefs of non-believers’. Put simply, he wants to know why ‘facts don’t work’.”
Nothing wrong with this, since holding beliefs in denial of the facts is not inconsistent with intelligence, education, or otherwise rational thinking. We do tend to rationalize our beliefs and cherry-pick our data to support them through confirmation bias, as Parsons notes, and the smarter and better educated we are, the better we are at rationalizing.
This has already been done, and judging from the quotes below, a much better job was made of it, in Michael Shermer’s 2011 book ‘The Believing Brain.’ Here’s my review of it.
My skeptic sense began to tingle when I saw this, flipping the “on” switch of my baloney detector:
“Storr studies not only the thought process behind conspiracy theories, but also the unwavering rationalism of their opponents.”
My suspicions were put on full alert, my logical fallacy meter suddenly hitting the “red” end of the scale a few lines down, when I saw this:
“He attends a conference of ‘sceptics’, who insist there is ‘no evidence for homeopathy’. When he asks the sceptics what scientific literature on homeopathy they’ve read to support these claims, many admit they haven’t read any. This isn’t to say that homeopathy isn’t legitimate – merely that many ‘rationalists’ dismiss it because they don’t want to believe it in the first place.”
WTF?? What a remarkably naive and illogical set of statements… a perfect trifecta of an ad hominem dismissal, a straw-person fallacy, and a blatant argument from ignorance. Since this statement isn’t in quotation marks, I’ll assume it’s in Parsons’ own words, or at best a paraphrase.
The statement attempts to dismiss skeptical criticism of homeopathy by completely misrepresenting the skeptical position, trotting out the tired old accusation that skeptics are closed-minded. At the same time, it improperly shifts the burden of proof, arguing from a lack of data: “If you haven’t seen any scientific proof that (insert favorite belief) isn’t true, then it must be legitimate. You just don’t want to accept it!”
Skeptics dismiss homeopathy with good reason: it’s scientifically implausible on its face, and most importantly, no valid, well-controlled studies exist in the literature supporting its therapeutic use at all, except maybe for curing your thirst. Every adequate double-blind study conducted so far has shown no medical efficacy for homeopathy whatsoever. It has either never been found to work, or has been found not to work.
It’s essentially magic water.
Accepting homeopathy as science would force us to discard major portions of physics and chemistry, and given the lack of evidence and the silly arguments advanced for it, there is no good reason to do so.
Don’t believe me? See for yourself. I suggest checking out JSTOR, the Lancet, the New England Journal of Medicine, Nature, Science, and PubMed. Be aware that you may have to pay a subscription or article download fee. If nothing else, take a look at the abstracts given for the articles. I have, on those occasions when buying the article was for whatever reason not an option.
Don’t trust peer-reviewed journals because you think the peer-review process and science as a whole are broken? Sorry, but I can’t help you there.
Well, given this I’ll probably buy the book, but this article doesn’t really raise my expectations.
The powers of the paranormal, if they exist, cannot be very great if they are so easily thwarted by mere doubt. It seems as though, in the world of supernatural claims, doubt is the strongest magic of all. It can cancel anything except science, which actually needs it to work. At least, this is the impression I get from the claims of paranormal believers when attempts to replicate initially successful parapsychology studies fail. And fail they have: once the controls of the initial study are improved, statistical significance drops closer to chance levels and effect sizes shrink to zero.
It seems to me that even with perfect methodology there would still be a chance of false-positive results, and that what these studies show is not what they are claimed to show: only that something other than chance may be at work, with no indication of what that may actually be. It could be poor experimental design, inappropriate use of statistics, errors in reasoning, bad data collection, or, rarely, but often enough to taint the entire field of study, fraud.
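A quick simulation illustrates that last point (this sketch is mine, not from any study discussed here; the card-guessing setup, sample sizes, and significance threshold are all illustrative assumptions): if we run many “experiments” on an effect that does not exist, a small but steady fraction will still clear the conventional p < 0.05 bar by luck alone.

```python
import random

def run_null_experiment(n_trials=100, rng=random):
    """Simulate n_trials chance-level guesses (true hit rate 50%, no psi)."""
    hits = sum(1 for _ in range(n_trials) if rng.random() < 0.5)
    # Normal approximation to the binomial: mean n/2, standard deviation sqrt(n)/2.
    z = (hits - n_trials * 0.5) / ((n_trials * 0.25) ** 0.5)
    # |z| > 1.96 corresponds roughly to a two-sided p < 0.05 result.
    return abs(z) > 1.96

random.seed(42)  # fixed seed so the run is reproducible
n_experiments = 10_000
false_positives = sum(run_null_experiment() for _ in range(n_experiments))
rate = false_positives / n_experiments
print(f"Fraction of null experiments flagged 'significant': {rate:.1%}")
```

Roughly five percent of these purely chance-driven runs come out “significant,” which is exactly what the threshold promises: a field that publishes only its positive results will accumulate apparent successes even if nothing real is there.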
One thing never fails, though, and that’s the rationalizations offered for this failure to replicate. This post deals with a species of error in reasoning: special pleading, the Post Hoc [“after this,” or “after the fact”] fallacy, or the Ad Hoc [“for this (only)”] hypothesis, and sometimes just “covering your ass by making shit up.” I also aim to show that under the right circumstances it is not always a fallacy.
This fallacy, regardless of its name, is an attempt to rescue a claim from disproof by inventing special reasons why it should be exempt from the usual standards of evidence, deflecting criticism without demonstrating that these alleged reasons are in fact true or exist apart from the claim they defend. Every attempt has been made to boil the following examples of its use down to their essence and to avoid committing straw-persons:
Psi phenomena are shy, or jealous. They do not work in the presence of skeptics. Skeptical doubt cancels them.
What about this one?
Successful replications do occur, but the doubt of skeptics reading the journals they are published in reaches back through time, retroactively changing the experiment and causing it to fail.
Psi is an elusive and delicate phenomenon. Imposing excessively strict controls (read: adequate ones) in a sterile laboratory setting impedes Psi’s natural functioning.
What I find interesting about this sort of reasoning in its fallacious form is that it is considered acceptable in some circles.
Never mind that many of the replications are attempted by other believers, or by those without any apparent bias against the paranormal. Another such rationalization goes something like this:
They (believers or neutral parties who don’t get results) are burdened with a repressed skepticism that causes their replication attempts to fail, no matter what belief or neutrality they claim to have. These hidden attitudes unconsciously sabotage their efforts.
Never mind the fact that this argument is made on the basis of mere supposition and absent the use of a valid psychological test. Those who reason thus are essentially claiming to be able to read minds, the very thing that some of these replication attempts have failed to demonstrate.
This phenomenon, in which some get positive results in their studies and others get negative results depending on their belief systems, is known in parapsychology as the Shyness Effect or the Wiseman Effect. In a form applying broadly to any field of science where attitudes may unconsciously influence results, it’s known as the Experimenter Effect or the Observer-expectancy effect, and it is one of the reasons for double-blinding studies and for other forms of experimental control.
A good example was a series of medical studies of a procedure known as the portacaval shunt. Analysis of these studies found that researchers more enthusiastic about the procedure tended to get false-positive results more often than those not so inclined. And this was from studies assessing an experimental surgical method, not magic mind-powers.
Above were some examples of this form of argument used as a fallacy, but are Ad Hoc hypotheses always and everywhere bad reasoning?
This can be a perfectly good way of reasoning, as long as at least one of the following conditions is met:
- The reason given for the failure to demonstrate something has been shown, or can be shown, to exist independently of the hypothesis it is used to support. There must be valid evidence that it is true and relevant as a viable supporting reason.
- The Ad Hoc hypothesis is interesting and fruitful, predicting new phenomena that could in principle be tested, even if the hypothesis itself turns out not to be true. The key point is that it must be testable, whether by verification or falsification, depending on whether it is a general or a particular claim.
- The Post Hoc reasoning is used to invent new and creative ways to test a claim, and as long as it is used to further inquiry and not merely to thwart the goal of critical reasoning by making up silly excuses as needed.
A good example of an Ad Hoc hypothesis that was both interesting and fruitful was Einstein’s addition of the Cosmological Constant to general relativity, which, though he later rejected it and called it his “greatest blunder,” has proven useful today in the concept of Dark Energy, invoked to explain the accelerating expansion of the universe. Another would be the Lorentz contraction, offered to explain the failure of the Michelson-Morley experiment to detect the Earth’s motion through the Ether, and later incorporated into Einstein’s special relativity.
One thing to note about many forms of argument used as fallacies:
Philosophers and communications specialists may differ on this, but informal fallacies are not so much violations of argument form as violations of argument procedure: attempts to subvert the rules and goals of constructive argument and critical discussion. In this sense they are abused, often out of ignorance but sometimes out of intellectual dishonesty, as rhetorical devices masquerading as cogent arguments when they are not. For ethical, productive argumentation, try to keep this in mind and avoid them yourself whenever possible. Happy debating.
This is from a while back, but for ten minutes, it’s the most complete, in-depth discussion Dr. Tyson has given on this subject.
What does it mean to lie? In a strict sense, lying means relating a falsehood that one knows on some level to be just that: false. We are not always honest with ourselves, and our brains have many ways of deceiving themselves. We are frequently convinced, often quite strongly, of things that are simply not true, tenaciously holding some falsehoods as self-evident truths.
It seems paradoxical to be able to believe what we know to be false, so how then may it be possible to fool ourselves in that manner and still be aware, at least intellectually, that we are doing so?
First, a bit on what it means to ‘know’ something. Knowledge at its most basic level involves both awareness of an idea, or probable fact, and its acceptance.
Knowledge involves belief that something is or is not the case, the belief needed to fully grasp the intricacies and nuances of what is known. But we are not all the way there yet; we have a couple more steps to go…
To know something, we must not only be informed of a thing and believe it to be the case; it must also actually conform to existing facts: it must be true. Not just this, but there must be some grounds for believing it, for convincing ourselves that we have knowledge and not just a lucky guess on our part.
Strictly speaking, you can’t really know something that’s false, and you can’t truthfully say you know something without good grounds…
There must be some information available, usually gained by our senses, by which we obtain those grounds and the justification for the item of knowledge we possess — some channel of information must necessarily and sufficiently complete the picture, so that we can confidently think that we know something.
Whether these grounds come from our own personal sensory experience, often enhanced by our instruments and other artifacts, from secondhand or further removed testimony given by others (which usually needs some grounding itself, like the real and relevant expertise of the source giving the testimony…), or possibly from other channels of information as well, we must have evidence, and it must be strong enough to justify the claim we accept.
We surely deceive ourselves and convince ourselves of probable falsehoods, and we often hold conflicting beliefs by walling them off from each other, even using processes of doublethink and rationalization to entertain them without the discomfort we often experience when both or all come to our conscious awareness at the same time.
It’s possible to have a lack of confidence in one’s knowledge, the niggling doubt that sometimes happens to those of us who hold all knowledge to be subject to correction with further and better grounds to retain, reject or amend what we know at any given time.
With doublethink, rationalization, logical fallacies and belief in belief (believing for the sake of belief itself as a virtue), we may hold as at least partially true what we intellectually know (and thus to an extent accept) to be false, and move from compartmentalizing our accepted but conflicting claims into the territory of the pious fraud, and further into those of the pathological liar and of victims of false memory syndrome…
…as well as when we willfully sacrifice the value of reason and evidence in favor of what feels good to us, rather than the uncomfortable realities we are often forced to deal with in daily life.