Archive | June 2010

Don’t Be A Dick: the Party

This short film is one of four Public Service Announcements (PSAs) for skeptics, atheists, and others who need a bit of help when it comes to being social and friendly while still staying true to their (lack of) belief.

This episode stars Dan Turner and Louise Crane.

Scripted by Rebecca Watson

Filmed and edited by Charlotte Stoddart

Directed by Adam Rutherford

Title cards by DC Turner

Thanks to Tracy King & DC Turner for the filming location

(video courtesy of rkwatson’s YouTube Channel)

Don’t Be A Dick: The Will

This short film is one of four Public Service Announcements (PSAs) for skeptics, atheists, and others who need a bit of help when it comes to being social and friendly while still staying true to their (lack of) belief.

This episode stars Professor Chris French, Chris Blohm, and Mrs Cat.

Scripted by Rebecca Watson

Filmed and edited by Charlotte Stoddart

Directed by Adam Rutherford

Title cards by DC Turner

Thanks to Tracy King & DC Turner for the filming location

(video courtesy of rkwatson’s YouTube Channel)

Baloney Detection 101 – Self-Deception

(Image caption: Foolish young kitteh… fools himself)
Self-deception is extremely common, and it often leads us to believe that something is true when it’s not, or valid when it’s actually fallacious. It’s difficult for most people to avoid in everyday experience, and no matter how evidently false or rationally unsound a belief appears to others, self-deception is often our primary means of vindicating it to ourselves.

It’s far easier to fool ourselves than to fool others, because most people aren’t as introspective as they give themselves credit for. Thinking that one is infallibly self-honest and generally immune to gulling oneself is itself a product of self-deception, and a time-tested, reliable way of letting down one’s guard and opening the floodgates to more of the same.

Although self-deception can make us feel better when we are overly critical of ourselves, most psychologists and mental health professionals agree that self-deception is generally a Bad Thing™.

One consequence is that actions we take based on it proceed from a fallacious, flawed, incomplete, or outright false view of ourselves or the world, and such actions tend to be unsuccessful.

A host of personality factors feeds self-deception: personal or ideological bias, self-interest of varying degrees, anything we really want to be true, and whatever petty personal insecurities we may have. Together, these can powerfully, sometimes irrevocably, influence our will and our very need to believe, and not in ways that benefit us, especially when the insidious effects of personal experience play a part in the process.

Deceitfulness, irrationality, or simple personal circumstance can motivate the beliefs that result, since not all of us come equipped with, or take the opportunity to acquire, the mental toolkit for critical thinking needed to draw valid conclusions from our sense data, past experience, and personal knowledge base.

When it comes to the possible uncritical acceptance of unusual or otherwise questionable claims, there are a few things that need to be carefully considered:

  • Confirmation Bias — the tendency to seek out and give more notice to whatever confirms our prior beliefs, and to avoid or downplay whatever doesn’t…
  • Selective Thinking — the tendency to pay attention to and remember events we consider significant and meaningful to our beliefs, and to ignore and forget those we don’t. This is sometimes oversimplified as ‘counting the hits and forgetting the misses,’ though it can also involve paying attention to and remembering significant negative events, as long as they are personally meaningful…
  • Apophenia — the tendency to see significance and patterns in random noise and incomplete data where none exist…
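Selective thinking’s ‘counting the hits’ is easy to demonstrate with a toy simulation. Everything below, names and numbers alike, is illustrative rather than anything from the original post: a player guesses fair coin flips at random, and a selective observer records only the hits.

```python
import random

def guessing_game(trials, seed=42):
    """Guess fair coin flips at random, then compare actual accuracy
    with the accuracy a selective rememberer would report."""
    rng = random.Random(seed)
    hits = 0
    remembered = []  # a selective observer writes down only the hits
    for _ in range(trials):
        guess = rng.choice(["heads", "tails"])
        flip = rng.choice(["heads", "tails"])
        if guess == flip:
            hits += 1
            remembered.append(flip)  # misses are simply forgotten
    actual = hits / trials
    # Judging only by what was remembered, every recorded guess was a hit.
    perceived = 1.0 if remembered else 0.0
    return actual, perceived

actual, perceived = guessing_game(10_000)
print(f"actual accuracy:    {actual:.1%}")   # close to 50% -- pure chance
print(f"perceived accuracy: {perceived:.0%}")  # 100%: only hits were kept
```

The gap between the two numbers is the whole bias: the record the selective observer keeps is perfectly consistent with psychic powers, because everything inconsistent with them was thrown away before the tally.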

For these reasons, science places a heavy but appropriate emphasis on controlled, and especially randomized, studies; requires reliable replication of reported phenomena by unaffiliated researchers; employs double-blinded, even triple-blinded, testing procedures; uses evidential criteria that are clearly defined beforehand; and finally, demands fully and openly accessible data and, whenever possible, detailed procedural records.

There are a lot of people who are convinced, wrongly of course, that a string of letters before or after their names, or world-class intellectual brilliance in general, instantly grants complete immunity to fooling themselves, when in fact it just makes them better at inventing convincing rationalizations.

So being a genius by no means protects against gullibility. In fact, it can make things worse, especially when combined with an unhealthy amount of arrogance. There are a lot of amazingly intelligent people who are nevertheless quite easy to fool, though they’re incredibly skilled at technobabble handwaving. Fnord.

Project Logicality | Moving Goalposts

Have you ever had a discussion with someone who never lets you convince them of anything, however strong the evidence? What’s going on when people, even perfectly sane, otherwise rational, and sincere people, give excuse after excuse as to why the evidence and argument just aren’t ‘good enough’ to compel their assent, no matter their truth or validity?

Note that while it is easy to set the bar for evidence too low, we can also set it impossibly, or virtually impossibly, high.

So here, I deal with a favored rhetorical tactic of cranks, pseudoscientists, grand conspiracy theorists, charlatans of all stripes, and yes, ordinary people in everyday discussion: the Moving Goalpost.

Most people are fairly closed-minded and find changing their stance on things uncomfortable. It takes good metacognitive skills, that is, thinking about thinking, to correct this tendency.

The fallacy takes its name from an analogy with football: imagine goalposts that are moved whenever the ball approaches, always receding out of reach of whoever carries it.

With this tactic, the more unreasonable the standard of proof for refuting or confirming the claim, the better. It involves either arbitrarily redefining one’s claims to put them conveniently out of reach of any disproof, or setting impossible standards from the very beginning.

The objective here is to avoid having to rescind whatever claims one is making, when one has a political, financial, personal, or ideological stake in a position. For some, no amount of evidence and reason is enough, and it shows in this use of rhetoric.

A couple of examples might be:

Show me just one experiment conducted in a lab on Earth that has ever created dark matter, directly measured gravity, manufactured a black hole, or generated controlled stellar fusion! Establishment Cosmology™ is silly, fallacious, and wrong!

This argument clearly sets impossible standards from the beginning.

It, like what follows, uses a version of the “show me just one proof” gambit common among creationists and crank cosmology proponents (sometimes one and the same!).

The next illustrates shifting standards of proof each time evidence is presented:

I want to see any example of a transitional species before I think evolution even remotely plausible! Just one!

Tiktaalik? Ambulocetus?

There are still gaps in the fossil record between those and what came before and after! Where’s the evidence for those??

You’ve filled in those gaps?

Now there are more gaps to fill! Fraud! Fake! Amoral evilutionist! Evolution is a sham!

It’s important to decide beforehand what standard of evidence and logic is proportionate to the claim, and to stick with that as your gold standard throughout. Set reasonable standards, and once they have been met, admit it and change your mind.

Consistency may be called the hobgoblin of small minds, but it’s what’s needed when assessing claims open-mindedly and rationally.

Tf. Tk. Tts.

(Last Update: 2017.06.06)

Baloney Detection 101 — Anecdotes

Anecdotes… what are they? An anecdote is an account related directly by one or more eyewitnesses, or at second hand, third hand, or even further removed from the original source, and it lacks proper documentation or adequately trained observation. Anecdotes are usually reports from casual observers without corroborating evidence, though they are often, and fallaciously, touted as genuine evidence.

We are a story-telling species, but having a lot of stories to support a claim is not the same thing as supporting it with even a moderate amount of real evidence. No matter how many times you multiply zero, the result is always zero.
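To make the ‘multiplying zero’ point concrete: when witnesses share a systematic bias, piling up more reports just converges on the bias, not the truth. A minimal sketch, with every value here hypothetical:

```python
import random

def witness_reports(n, true_value=10.0, shared_bias=3.0, noise=2.0, seed=7):
    """Each witness reports the true value distorted by the same shared
    bias (expectation, suggestion, folklore) plus individual random noise."""
    rng = random.Random(seed)
    return [true_value + shared_bias + rng.gauss(0, noise) for _ in range(n)]

for n in (10, 100, 10_000):
    mean = sum(witness_reports(n)) / n
    # The average settles near 13.0 (truth + bias), never near 10.0.
    print(f"{n:>6} reports -> average {mean:5.2f}  (true value: 10.00)")
```

Individual noise averages out as the reports pile up; the shared bias does not. Ten thousand anecdotes agree with each other, and they are all wrong in exactly the same direction.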

Scientifically, anecdotes and personal testimony are best used to formulate hypotheses; they simply do not work for testing them. Even for the former, an anecdote or a collection of them is only useful under two conditions:

  • It (they) must be true.
  • It (they) must be representative.

Anecdotes are subject to a host of human failings, such as memory fallacies, logical fallacies, selective thinking, confirmation bias, perceptual construction, and embellishment over time, especially when frequently related, and even from the same witness.

The ‘evidence’ of personal experience can be insidiously deceptive, even trumping the need for more objective observation and documentation in the eyewitness’s mind. It is one of the principal ways people come to uncritically accept spurious claims.

This is why a court of law requires the use of other, better forms of evidence to validate the testimony of any witnesses involved in a case.

It’s important to bear in mind that one’s confidence in the accuracy of one’s memories, even ‘flashbulb’ memories, is often far out of proportion to how accurate they really are. Memories can fade, warp, and mutate with each recollection, becoming more exaggerated and meaningful to the teller and drifting ever further from reality.

Anecdotal testimonials are often used to promote medical, paranormal, and other pseudoscientific claims unsupported by any other form of evidence. That red flag should immediately set off one’s baloney detector about whatever products, services, or ideas are being touted.

Anecdotes can erroneously lead you to believe any statement you need or wish to be true. That’s something to think about the next time you hear that a ‘reliable eyewitness’ swears something is absolutely true because ‘he saw it with his own eyes.’ While it is often said that seeing is believing, it’s even more true that believing is seeing.