Archive | June 2010

Don’t Be A Dick: the Party

This short film is one of four Public Service Announcements (PSAs) for skeptics, atheists, and others who need a bit of help when it comes to being social and friendly while still staying true to their (lack of) belief.

This episode stars Dan Turner and Louise Crane.

Scripted by Rebecca Watson

Filmed and edited by Charlotte Stoddart

Directed by Adam Rutherford

Title cards by DC Turner

Thanks to Tracy King & DC Turner for the filming location

(video courtesy of rkwatson’s YouTube Channel)

Don’t Be A Dick: The Will

This short film is one of four Public Service Announcements (PSAs) for skeptics, atheists, and others who need a bit of help when it comes to being social and friendly while still staying true to their (lack of) belief.

This episode stars Professor Chris French, Chris Blohm, and Mrs Cat.

Scripted by Rebecca Watson

Filmed and edited by Charlotte Stoddart

Directed by Adam Rutherford

Title cards by DC Turner

Thanks to Tracy King & DC Turner for the filming location

(video courtesy of rkwatson’s YouTube Channel)

Baloney Detection 101 — Self-Deception

(Image caption: Foolish young kitteh… …fools himself)
Self-deception is extremely common, and it can often lead us to believe that something is true when it’s not, or valid when it’s actually fallacious. It’s extremely difficult for most people to avoid in everyday experience, and no matter how evidently false or rationally unsound a belief is to others, self-deception is often our primary means of vindicating it to ourselves.

It’s far easier to fool ourselves than it is to fool others, because most people aren’t as introspective as they give themselves credit for. Thinking that one is infallibly self-honest and in general immune to gulling oneself is itself the result of self-deception and a time-tested and reliable way of letting down one’s guard and opening the floodgates for more of the same.

Although self-deception can make us feel better when we are overly critical of ourselves, most psychologists and mental health professionals agree that self-deception is generally a Bad Thing™.

One consequence is that actions we take based on it derive from a fallacious, flawed, incomplete, or outright false view of ourselves or the world, and such actions consistently fail.

A host of personality elements are involved in self-deception: personal or ideological bias, self-interest of varying degrees, anything we really want to be true, and any petty personal insecurities we may have. Together, these can powerfully, and sometimes irrevocably, influence our will and our very need to believe, and not in a way beneficial to us, especially when the insidious effects of personal experience play an important part in the process.

Deceitfulness, irrationality, or simple personal circumstance can motivate the beliefs that result, since not all of us come equipped with, or take the opportunity to acquire, the critical-thinking toolkit needed to draw valid conclusions from our sense data, past experience, and personal knowledge base.

When it comes to the possible uncritical acceptance of unusual or otherwise questionable claims, there are a few things that need to be carefully considered:

  • Confirmation Bias — the tendency to seek out and give more notice to whatever confirms our prior beliefs, and avoid or downplay that which doesn’t…
  • Selective Thinking — the tendency to pay attention to and remember events we consider significant and meaningful to our beliefs, and to ignore and forget that which doesn’t. This is sometimes oversimplified as ‘counting the hits and forgetting the misses,’ though it can involve paying attention to and remembering significant negative events as well, as long as they are personally meaningful…
  • Apophenia — the tendency to see significance and patterns that do not exist in random noise and incomplete data…
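Selective thinking in particular can be made concrete with a toy simulation. The numbers below are purely illustrative assumptions, not real data: every “prediction” is a coin flip, and the only difference between the two observers is which outcomes they remember.

```python
import random

def hit_rates(trials=10_000, forget_miss=0.8, seed=1):
    """Toy model of 'counting the hits and forgetting the misses'.

    Every trial is a 50/50 guess (think of a vague psychic prediction).
    A fair observer tallies every outcome; a selective observer keeps
    every hit but forgets each miss with probability `forget_miss`.
    All parameters are illustrative assumptions.
    """
    rng = random.Random(seed)
    hits = remembered_misses = 0
    for _ in range(trials):
        if rng.random() < 0.5:
            hits += 1                      # a 'hit' -- always remembered
        elif rng.random() >= forget_miss:
            remembered_misses += 1         # a miss that survives forgetting
    fair = hits / trials
    selective = hits / (hits + remembered_misses)
    return fair, selective

fair, selective = hit_rates()
print(f"honest hit rate: {fair:.2f}, remembered hit rate: {selective:.2f}")
```

The honest tally hovers around chance, while the selective observer’s remembered record looks impressively accurate, despite both watching exactly the same coin flips.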

For these reasons, science places a heavy but appropriate emphasis on controlled studies, especially randomized ones; requires reliable replication of reported phenomena by unaffiliated researchers; employs double-blinded, even triple-blinded, testing procedures; uses evidential criteria that are clearly defined beforehand; and finally, demands fully and openly accessible data and, whenever possible, detailed procedural records.
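As a sketch of why randomized, controlled counting works where selective memory fails, here is a hypothetical trial of an inert treatment (the recovery rate and sample size are invented for illustration):

```python
import random

def randomized_trial(n=20_000, base_rate=0.6, seed=3):
    """Simulate a randomized trial of an inert 'treatment'.

    Subjects are assigned to the treatment or control arm by coin flip,
    and everyone recovers at the same base rate regardless of arm.
    Because both arms count *all* outcomes, the estimated effect lands
    near its true value: zero. Parameters are illustrative assumptions.
    """
    rng = random.Random(seed)
    arm_hits = {True: 0, False: 0}
    arm_n = {True: 0, False: 0}
    for _ in range(n):
        treated = rng.random() < 0.5           # random assignment
        recovered = rng.random() < base_rate   # same odds in either arm
        arm_n[treated] += 1
        arm_hits[treated] += recovered
    return arm_hits[True] / arm_n[True] - arm_hits[False] / arm_n[False]

print(f"estimated effect of an inert treatment: {randomized_trial():+.3f}")
```

Randomization means neither bias in assignment nor selective counting of outcomes can smuggle an apparent effect into the result.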

There are a lot of people who are convinced, wrongly of course, that a string of letters before or after their names, or world-class intellectual brilliance in general, instantly grants them complete immunity to fooling themselves, when in fact it merely makes them better at constructing convincing rationalizations for themselves.

So, being a genius doesn’t by any means provide protection from gullibility. In fact, it can make it worse, especially when combined with an unhealthy amount of arrogance. There are a lot of amazingly intelligent people who are nevertheless quite easy to fool — though they’re incredible when it comes to skill with technobabble handwaving. Fnord.

Logical Fallacies — the Moving Goalpost


(This post has been rewritten and reposted with corrections since its original publishing date)

Hey, guys. This post deals with that favorite specious rhetorical tactic of science-deniers and middle-school debating clubs everywhere, the Moving Goalpost.

This fallacy is functionally similar to an American football game in which the player carrying the ball is faced with goalposts that are continuously receding, and no matter how fast he runs or how far he throws the ball, they are always out of reach.

This tactic consists of demanding ever further-out-of-reach standards of proof or evidence for a claim, the more difficult to meet, the better. Should the requirement for evidence somehow be met, the criteria are then arbitrarily revised to be even more stringent and unreasonable than before, rather than acknowledging that the originally stated criteria have been satisfied.

This fallacy involves redefining one’s claims to put them conveniently out of reach of any possible falsification.

It’s simply a time-consuming and extremely roundabout way of answering the question, “What evidence would change your mind about X?” with the statement, “No evidence you can possibly present would ever be enough to convince me!”

This tactic is a favorite of creationists, electric universe proponents, anti-vaccinationists, HIV-deniers, global climate-change contrarians, and Alt-Med advocates, and of anyone else engaged in ideologically motivated denial of uncomfortable or inconvenient facts.

Below are a couple of examples of this in use:

  • Show me just one experiment conducted in a lab on Earth that has ever produced dark matter, directly measured the effect of gravity, created a black hole, a working example of stellar fusion, or replicated the effects of dark energy at laboratory scales!

The above is an excellent example of moving the goalposts out of reach from the very beginning. Then there’s the following, in which the requirements for evidence recede each time the argument is reused:

  • I want to see just one example of a transitional species between fish and tetrapods before I can accept evolution!
  • Tiktaalik? Now you have two more gaps in the fossil record to fill between Tiktaalik and whatever existed before and after it!
  • What? You’ve filled in those two gaps? Now you have four more gaps to fill! You still haven’t met my requirements! This proves evolution is a sham!

In any constructive discussion, it’s important to state up front just what evidence you will accept for a claim of fact, to stick with that standard throughout the discussion rather than pushing it ever closer to impossibility, and finally, to admit when the standard has been met.

It should be apparent from these examples that many fringe-proponents can never admit that they are wrong, when demanding ‘just one proof’ and then demanding ‘just one more proof’ and so on, since for champions of Revealed Truth™, it simply wouldn’t do to do something as unbecoming as changing one’s mind.

Cranks, deniers, and fringe theorists often like to claim that they are open to evidence, and that they could be mistaken, but when it comes right down to it they show themselves to be as closed-minded as they like to project onto their critics.


Baloney Detection 101 — Anecdotes

Anecdotes…What are they? An anecdote is an account related directly by one or more eyewitnesses, or passed along second-hand, third-hand, or even further removed from the original source, and it lacks proper documentation and adequately trained observation. Anecdotes are usually reports given by casual observers without corroborating evidence, though they are often, and fallaciously, touted as genuine evidence.

We are a story-telling species, but having a lot of stories to support a claim is not the same thing as supporting it with even a moderate amount of real evidence. No matter how many times you multiply zero, the result is always zero.

Scientifically, anecdotes and personal testimony are best used to formulate hypotheses; they simply do not work for testing them. Even for that limited purpose, an anecdote or collection of them is only useful under two conditions:

  • It (they) must be true.
  • It (they) must be representative.
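The representativeness condition is the one anecdotes most reliably fail, because the stories that reach us are self-selected. The toy model below, with invented numbers, shows how testimonials for a completely inert remedy can still be unanimously positive:

```python
import random

def testimonials(patients=10_000, natural_recovery=0.6, seed=7):
    """Hypothetical inert remedy: users recover at the population's
    natural rate, but only those who recover write testimonials. The
    resulting anecdotes are uniformly glowing even though the remedy
    does nothing. All parameters are illustrative assumptions."""
    rng = random.Random(seed)
    recovered = sum(rng.random() < natural_recovery for _ in range(patients))
    positive_stories = recovered   # the non-recovered stay silent,
    total_stories = recovered      # so every story is a success story
    return recovered / patients, positive_stories / total_stories

base_rate, positive_share = testimonials()
print(f"actual recovery rate: {base_rate:.2f}, "
      f"share of positive testimonials: {positive_share:.2f}")
```

The testimonials report 100% satisfaction while the remedy contributes nothing; only counting the silent non-recoverers, as a controlled study would, reveals that.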

Anecdotes are subject to a host of human failings, such as memory fallacies, logical fallacies, selective thinking, confirmation bias, perceptual construction, and embellishment over time, especially when frequently related, and even from the same witness.

The ‘evidence’ of personal experience can be insidiously deceptive, even trumping the need for more objective observation and documentation in the eyewitness’s mind. It is one of the principal ways people come to uncritically accept spurious claims.

This is why a court of law requires the use of other, better forms of evidence to validate the testimony of any witnesses involved in a case.

It’s important to bear in mind that one’s confidence in the accuracy of one’s memories, even ‘flashbulb’ memories, is often far out of proportion to how accurate they really are. Memories can fade, warp, and mutate with each recollection, becoming more exaggerated and meaningful to the teller while their relation to reality becomes more distorted.

Anecdotal testimonials are often used to promote medical, paranormal, and other pseudoscientific claims not supported by any other form of evidence. This red flag should immediately set off one’s baloney detector about whatever products, services, or ideas are being touted.

Anecdotes can erroneously lead you to believe any statement that you need or wish to be so. That’s something to think about when you hear that a ‘reliable eyewitness’ swears that something is absolutely true because ‘he saw it with his own eyes.’ While it is often said that seeing is believing, it’s even more true that believing is seeing.