Chris Parsons on Yahoo! News posted this article on Will Storr’s new book ‘The Heretics: Adventures With The Enemies Of Science,’ and his post raised a few livid crimson flags, which I’ll point out below. His article starts innocently enough, with this statement:
“Will Storr is a man who deals in facts. As a journalist of more than 10 years, undeniable evidence and rational data are his bread and butter. There are groups of people, however, who deny the irrefutable; who see cold, hard facts as mistruths or simply inconvenient.”
Parsons goes on to include among these people Holocaust deniers, creationists, and UFO believers. Still, given what I’ve learned about the rank-and-file of journalistic media over my six years as a skeptic, I may be excused a little suspicion of a possible argument from authority in Parsons’s first two sentences, which describe a kind of journalist I’ve seldom encountered at most outlets I’ve heard or read.
Not to diss journalists as a whole, but fewer of them than there should be are as fact-based as is being implied here. But let’s grant, shall we, that Mr. Storr is one of the few who sticks to facts and rationality and knows what he’s doing, though I’d never heard of him before now.
Parsons continues, with this:
“So why are there intelligent, seemingly rational people like this, who are capable of such unreasonable logic? The question is the subject of Storr’s new book, which explores the ‘beliefs of non-believers’. Put simply, he wants to know why ‘facts don’t work’.”
Nothing wrong with this, since having beliefs in denial of facts is not inconsistent with intelligence, education, or otherwise rational thinking in people. We do tend to rationalize our beliefs and cherry-pick our data to support them through confirmation bias, as Parsons notes, and the smarter and better educated we are, the better we are at rationalizing.
This has already been done, and given what follows in quotes below, done much better, in Michael Shermer’s 2011 book ‘The Believing Brain.’ Here’s my review of it.
My skeptic sense began to tingle when I saw this, flipping the “on” switch of my baloney detector:
“Storr studies not only the thought process behind conspiracy theories, but also the unwavering rationalism of their opponents.”
My suspicions were put on full alert, my logical fallacy meter suddenly hitting the “red” end of the scale a few lines down, when I saw this:
“He attends a conference of ‘sceptics’, who insist there is ‘no evidence for homeopathy’. When he asks the sceptics what scientific literature on homeopathy they’ve read to support these claims, many admit they haven’t read any. This isn’t to say that homeopathy isn’t legitimate – merely that many ‘rationalists’ dismiss it because they don’t want to believe it in the first place.”
WTF?? What a remarkably naive and illogical set of statements… a perfect trifecta of an ad hominem dismissal, a straw-person fallacy, and a blatant argument from ignorance. Since this claim isn’t itself in quotation marks, I’ll assume it’s in Parsons’s own words, or at best a paraphrase.
The statement attempts to dismiss skeptical criticism of homeopathy by completely misrepresenting the skeptical position, trotting out the tired old accusation that skeptics are closed-minded, and at the same time improperly shifting the burden of proof by arguing from a lack of data: “If you haven’t seen any scientific proof that (insert favorite belief) isn’t true, then it must be legitimate. You just don’t want to accept it!”
Skeptics dismiss homeopathy with good reason: it’s scientifically implausible on its face, and most importantly, there are no valid, well-controlled studies in the literature supporting its therapeutic use at all, except maybe for curing your thirst. Every adequate double-blind study conducted so far has yielded no medical efficacy whatsoever for homeopathy. It has either never been found to work, or has been found not to work.
It’s essentially magic water.
Accepting homeopathy as science would force us to discard major portions of physics and chemistry, and with no good reason to do so, given the lack of evidence and the silly arguments advanced for it.
Don’t believe me? See for yourself. I suggest checking out JSTOR, the Lancet, the New England Journal of Medicine, Nature, Science, and PubMed. Be aware that you may have to pay a subscription or article download fee. If nothing else, take a look at the abstracts given for the articles. I have, on those occasions when buying the article was for whatever reason not an option.
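For those who’d rather script the literature search than click through it, here’s a minimal sketch, my own illustration rather than anything from Parsons or Storr, that queries PubMed’s public E-utilities API for controlled trials of homeopathy and prints the titles of the hits; the search term and the ten-result cap are assumptions made for the example.

```python
# A minimal sketch, assuming Python 3 with the 'requests' package installed.
# It searches PubMed via the public NCBI E-utilities endpoints and prints the
# title of each matching record; the query string and result limit below are
# illustrative choices, not a definitive search strategy.
import requests

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

# Step 1: search PubMed for matching record IDs (PMIDs).
search = requests.get(
    f"{EUTILS}/esearch.fcgi",
    params={
        "db": "pubmed",
        "term": "homeopathy AND randomized controlled trial[Publication Type]",
        "retmode": "json",
        "retmax": 10,
    },
    timeout=30,
).json()
pmids = search["esearchresult"]["idlist"]

# Step 2: fetch short summaries (titles, journals) for those PMIDs.
summary = requests.get(
    f"{EUTILS}/esummary.fcgi",
    params={"db": "pubmed", "id": ",".join(pmids), "retmode": "json"},
    timeout=30,
).json()

for pmid in pmids:
    record = summary["result"][pmid]
    print(f"{pmid}: {record['title']}")
```

Once you have the record IDs, the abstracts can be read on PubMed’s own site, which is usually enough when the full articles sit behind a paywall.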
Don’t trust peer-reviewed journals because you think the peer-review process and science as a whole are broken? Sorry, but I can’t help you there.
Well, given this I’ll probably buy the book, but this article doesn’t really raise my expectations.
This is from a while back, but at ten minutes it’s the most complete, in-depth discussion Dr. Tyson has given on this subject.
What does it mean to lie? In a strict sense, lying means relating a falsehood that one knows on some level to be just that — false. We are not always honest with ourselves, and our brains have many ways of deceiving themselves. We are frequently convinced, often quite strongly, of things that are simply not true, tenaciously holding some falsehoods as self-evident truths.
It seems paradoxical to be able to believe what we know to be false, so how then may it be possible to fool ourselves in that manner and still be aware, at least intellectually, that we are doing so?
First, a bit on what it means to ‘know’ something. Knowledge at its most basic level involves both awareness of an idea, or probable fact, and its acceptance.
Knowledge involves belief that something is the case or that it is not, a belief needed to fully grasp the intricacies and nuances of what is known, but we are not all the way there yet. We have a couple more steps to go…
To know something, we must not only be informed of a thing and believe it to be the case, but it must actually conform to existing facts — it must be true. Not just this, but there must also be some grounds for believing it, for convincing ourselves that we have knowledge and not just a lucky guess on our part.
Strictly speaking, you can’t really know something that’s false, and you can’t truthfully say you know something without good grounds…
There must be some information available, usually gained by our senses, by which we obtain those grounds and the justification for the item of knowledge we possess — some channel of information must necessarily and sufficiently complete the picture, so that we can confidently think that we know something.
Whether these grounds come from our own sensory experience (often enhanced by our instruments and other artifacts), from secondhand or further-removed testimony given by others (testimony usually needing some kind of grounding itself, like the real and relevant expertise of the source giving it…), or possibly from other channels of information as well, we must have evidence, and it must be strong enough to justify the claim we accept.
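For readers who like things compact, the classical analysis these last few paragraphs walk through, knowledge as justified true belief, can be written in a single line; the notation below is just my own shorthand for summing it up, not anything from the discussion above.

```latex
% A rough sketch of the classical "justified true belief" analysis built up
% above: a subject S knows a proposition p only when p is true, S believes p,
% and S has adequate grounds (justification) for believing p.
\[
  K_S(p) \iff p \,\wedge\, B_S(p) \,\wedge\, J_S(p)
\]
% Gettier-style counterexamples complicate this picture, but it captures the
% three conditions (truth, belief, grounds) laid out one at a time here.
```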
We surely deceive ourselves, convincing ourselves of probable falsehoods, and we often hold conflicting beliefs by walling them off from each other — even using processes of doublethink and rationalization to entertain them without the discomfort we often experience when both or all of them come to our conscious awareness at the same time.
It’s possible to have a lack of confidence in one’s knowledge, the niggling doubt that sometimes happens to those of us who hold all knowledge to be subject to correction with further and better grounds to retain, reject or amend what we know at any given time.
With doublethink, rationalization, logical fallacies and belief in belief — believing for the sake of belief itself as a virtue — we may hold as at least partially true what we intellectually know (and thus to an extent accept) to be false, and move from compartmentalizing our accepted but conflicting claims into the territory of the pious fraud, and further into those of the pathological liar and the victim of false memory syndrome…
…as well as when we willfully sacrifice the value of reason and evidence in favor of what feels good to us, rather than the uncomfortable realities we are often forced to deal with in daily life.
When we are faced with an incentive to believe something that agrees strongly with our prejudices, it comes down to a simple matter of “Can I believe?” that we may not even have to ask. We accept such things on a whim, unless we exercise care in our thinking. We have a tendency to first believe things we have an emotional investment in, and then cobble together, often quite ingeniously, reasons to justify our belief.
When we are faced with those things we are disinclined to believe, things contrary to our ideologies or belief/value paradigms, it’s a matter of “Must I believe?”, as though we are being faced with an uncomfortable choice, and we go immediately on the defensive, mustering frequently clever rationalizations to attack the discomforting idea and defend our belief structures from harm.
But I would add a third option, shown by thinkers and investigators I’ve known, read and listened to who approach certain…nonscientific and scientific…topics as intellectual curiosities or academic subject matter without a clear vested interest in accepting or rejecting the claims that these concern:
“Should I believe? Do I have sound reasons to accept this claim as true, or do I have sound reasons to reject it as false, or worse, as not even wrong?”
Often these questions aren’t even consciously asked by those believing, disbelieving, or suspending both belief and disbelief until the data are in.
But the first two involve belief or disbelief first, followed by attempts at conscious justification, often subjectively ironclad, and often fallacious, whereas the third involves deliberation, a weighing of evidence and argument, followed by a tentative conclusion, possibly with leanings toward either end of a continuum running from credulity to denial, but a conclusion subject to newer and better information and reasoning as they are presented.
The third option is uncommon, and involves thinking unfamiliar to most of us, but as it occurs with perfectly normal human brains operating with the proper training and accumulated habit, it is every bit as human as reflexive acceptance or knee-jerk rejection.
It’s something we probably did not specifically evolve to do, but, like playing a piano (another skill without a direct adaptive function), it’s something we can learn to do, and quite skillfully for many of us.
I think it’s something worth doing, but it requires that we consciously override some of our impulses, examine our thinking and our motivations, mind the soundness of our reasoning and the solidity of the facts we claim, and always remember that these things all have limits — they are fallible, but used well and with care they are reliable and effective as paths to real knowledge.
We must consider the input and critiques of others, for alone we are prone to misleading ourselves, even the smartest and best educated of us, with our own biases and fallacies of thinking and memory.
To quote the late Carl Sagan, “Valid criticism does you a favor.”
This is why modern science acts as a community, so that research workers can get public input from their colleagues and cross-check their findings, and it is why external replication of results is of the greatest importance — one-off phenomena that are impossible to verify objectively are of little use, and any finding must at least in principle be testable, or it cannot be demonstrably known.
Scientific inquiry works as effectively as it does because, unlike any other set of methods, it can tell us when we are wrong, and even when we are, it sometimes lets us continue to reap discoveries from failed ideas that lead into new territory.
Is there something better than this now? Will there be, ever?
I don’t know, to both questions. If such a set of methods exists, I’ve not heard of it, and apparently, neither has anyone else I know of.
But if and when something superior comes along, something that more effectively and accurately does what scientific inquiry, and skeptical inquiry as part of it, does at the moment, then I shall happily change my mind about science and support whatever works best instead.