ISP #25 – What the F#@K are we Talking About?

From last June! We’re so timely!! Here are the show notes…


ISP #19 – Potluck Skepticism

As mentioned in the last post, this show actually came before episode 18. But that doesn’t matter. Time is all wibbly-wobbly anyway.

In this episode:

  • McElroy reviews “God’s Not Dead” (here’s the Reasonable Doubts episode about it, too)
  • Erno talks about woo he discovered walking around his town in Finland
  • Bohler talks about the stuff he’s been researching, especially the history of civilization and how quacks cherry-pick evidence
  • We discuss the Burzynski Clinic and its record of failure and pseudoscience

No show notes this time – we were just winging it!

YUM.

Irreverent Skeptics Podcast Episode 18 – Science, Bitches!!

We’re switching the order of things up a bit. On April 19 and April 26, 2014, we did two ‘potpourri’ episodes, where we each came up with our own topics centered around a vague general subject. In the first episode… which was actually the second episode… after a quick discussion of how the internet might be able to give you PTSD, we talked about science stories we found fascinating:

  • McElroy talked about the Higgs boson and how it gives things mass
  • Bohler talked about South American civilizations and why the Ancient Aliens people are assholes
  • Jon talked about the concept of emergence and its broad applications in many fields

And then… McElroy pretended to be a presuppositionalist apologist, and pissed Jon off immensely. Check out the full show notes!

Two notes: Yes, sadly, the Colton Burpo Twitter feed turned out to be a parody. And no, the next episode won’t be about scary ways to die… the next show will actually be the April 19 show. You have one more episode in the meantime before we scare you to death!

YES PLEASE I WOULD LOVE TO KNOW MORE


Help make skeptical and scientific content accessible!

According to the WHO, 5% of the world’s population – about 360 million people – are deaf or hearing-impaired. That’s a huge audience that isn’t being served if your videos aren’t captioned. There are also, obviously, lots of skeptics who don’t speak English but who would love the material as well.

To better serve these segments of the skeptical community, a few groups have started projects to subtitle skeptical and scientific videos, through services like Amara.org. Tim Farley, over at the Skeptools blog, has written up a post summarizing a few of the efforts that are ongoing. If you think you might be able to help out, even with just a video or two, surf on over and check it out. Find something you’re interested in and get captioning!

Irreverent Skeptics Podcast Episode 17 – I Ain’t Afraid o’ No Ghosts

On April 12, 2014, we talked about SPOOKY GHOSTS! McElroy related his experiences on a ghost hunt and in a seance, Bohler talked about the tools of the trade, Jon described some of the “world’s most haunted places,” and more. Take a listen! Also check out the outtake for this episode… we had a bit of a false start.

No, seriously. Ghosts.


Irreverent Skeptics Podcast Episode 16 – It’s a Good Day to Diet

On March 29, 2014, the gang talked about dieting – fad diets, diets that work, diets that could kill you, and so on. We also talked about what actually worked for all of us and why some diets just aren’t up to snuff. Read on for the show notes! We also have a couple of outtakes: here’s the first and here’s the second.

For best results, never eat this.


The wheat and the chaff: finding good science writing among the bad

I find it an interesting exercise to look at the difference between scientific and popular coverage of news stories about science.

BRAINS. For using, not eating.

For example, a bunch of stories have popped up recently about an experiment that used transcranial magnetic stimulation (TMS) to improve people’s memory. The technique sends an electromagnetic pulse into a targeted part of a person’s brain. In this experiment, researchers aimed the pulse at a cortical region connected to the hippocampus – a part of the brain involved in associative memory that normally can’t be reached without surgery – to see whether those connections would transmit the stimulation and have some effect on the hippocampus’ function. After five 20-minute sessions (one per day), the researchers found “increased functional connectivity among distributed cortical-hippocampal network regions and concomitantly improved associative memory performance”. Basically, participants were given a memory test consisting of a set of arbitrary associations between faces and words that they were asked to learn and remember; those who received the placebo showed no improvement, while those who received the real treatment did. However, these improvements only lasted for about 24 hours after stimulation.

Various online news outlets have covered this in different ways. The most complete coverage I found was in Science magazine, which has the entire study. But it’s highly technical, and most people would probably zone out trying to read it. Research papers aren’t for everybody, of course. The article I found most accessible to the layman was from Popular Mechanics. They go into some possible concerns with the design of the study, including that there is a definite detectable difference between the placebo treatment and the real treatment. From Michael Fox, a neuroscientist at Harvard Medical School who was not involved in the study:

“The real [TMS] causes a contraction on the scalp muscles, almost like somebody is tapping on your head,” he says. “The sham [TMS] feels much, much weaker. After having both, I’m fairly sure you could tell which one you had.” Unfortunately, as Fox points out, the scientists did not ask their participants which procedure they thought they had, which leaves a bit more room for a placebo effect to account for some of their results.

I didn’t see any mention of the duration of the effects in this article, however. The Newsweek article is kind of ridiculous, mischaracterizing transcranial magnetic stimulation as “using a powerful electromagnet to shoot electricity into a person’s head”. They also don’t appear to understand the difference between an MRI, which investigates structure, and an fMRI, which investigates blood flow in the brain as a proxy for mapping brain activity. They repeatedly make references to electroconvulsive therapy, which is totally irrelevant. No mention of the duration limitations, either. Even Medical Daily made the same false association in an article about a different TMS study:

What once was called “electroshock therapy” in the sordid halls of psychiatric institutions appears ready to make its return, though the newest incarnation of brain stimulation therapy is much more refined and much more targeted.

TMS is not electroconvulsive therapy. It’s a magnetic therapy that induces a small electric current in the wiring of the brain. According to the National Institute of Mental Health, TMS

uses a magnet instead of an electrical current to activate the brain. An electromagnetic coil is held against the forehead and short electromagnetic pulses are administered through the coil. The magnetic pulse easily passes through the skull, and causes small electrical currents that stimulate nerve cells in the targeted brain region. Because this type of pulse generally does not reach further than two inches into the brain, scientists can select which parts of the brain will be affected and which will not be. The magnetic field is about the same strength as that of a magnetic resonance imaging (MRI) scan.

(The variant of TMS used in this new research – repetitive transcranial magnetic stimulation – has been tested as a treatment for all sorts of disorders, including migraine, stroke, Parkinson’s disease, dystonia, tinnitus and depression.)

As I browsed through these various articles, I was struck by the different rhetorical techniques the authors used to try to snag the readers’ interest. Some started off their articles with gruesome bits of history, like Newsweek’s (misleading) mention of ECT. Others tried humor, making jokes about refrigerator magnets and metal plates in the head. Lots of articles had a basic summary of the experimental procedure, which was good, but few went into detail about how the study was blinded and how placebo controls were used.

As skeptics, I think it’s important for us to recognize that there’s often much more to any science story than a single source will tell us. It’s vital to sift through the information heap to find common threads and hard data, so that we can be sure we’re not just falling for one writer’s spin on things. Modern media is all about getting and retaining readers, and in a market crowded with sensationalist clickbait articles and pithy five-second sound bites, it’s getting harder and harder for an outlet to stay relevant when it just wants to report the facts.

A few tips for the intrepid skeptics who want to be sure they’re getting the real story:

  • Seek out multiple sources. Don’t take any single source as authoritative. Some sites will play fast and loose with the facts and make connections that aren’t really appropriate (as seen in the Newsweek example above).
  • Give more credence to the original study or the actual source of the science. For example, if NASA scientists say they’re investigating what appears to be a physically impossible propulsion system, try to get the real story from them. Don’t just look at what someone like Buzzfeed says about it; they’re paid to make things seem fantastic and bizarre, and might make leaps that they shouldn’t (e.g., saying that “scientists are baffled” when they’re really just intrigued).
  • See if they link to the abstract of the study. If an article makes sure you know where to find the original information, they’re probably not trying to pull a fast one on you. Sure, it’s always possible that they’re doing it just to make it seem like they’re being transparent, but generally it’s safe to give someone the benefit of the doubt if they’re willing to cite the data. Just be sure that you don’t make this rule absolute; your best bet will still be the original, in case the writer is trying to put a spin on it.
  • Remember that an article about a single study is usually preliminary, and if the sample size is small, you should be all the more cautious about accepting the results as conclusive. Science is all about confirming and reproducing results, and no single study is ever enough to say that the results are absolute. There’s an unfortunate trend in science writing in which journalists make a big noise about amazing new studies with surprising results, but if and when those results are later contradicted, not much gets said about it.
  • When the study talks about an effect without a known mechanism, be especially skeptical until the results are repeated several times. You might be dealing with a study where some unknown confounding factor is involved, like an unexpected influence that wasn’t being controlled for. This may actually be the case with the rTMS study I discussed: the Time article quotes study author Joel Voss as saying that the research “was more of a hunch than I’d like to admit.” The researchers clearly expected that ‘syncing up’ several regions of the brain that work together with the hippocampus to produce associative memories would improve their function, but it’s not necessarily immediately obvious why the induction of electrical activity would make them sync up. (Then again, I’m no neuroscientist; there might be some underlying concept here I’m just not familiar with.)
  • On that note: be aware of your own limitations. If you read about a concept that you don’t understand particularly well, try to seek out the thoughts of someone with more expertise. Chances are that, unless you’re reading the study itself, the person writing the article isn’t an expert either, and taking the article at face value might just compound the author’s inexperience with your own, confusing things even more. Remember, good skeptics are always skeptical of their own thought processes as well; nobody can fool you quite as well as you can.
  • If a study itself doesn’t seem to pass the sniff test, don’t be afraid to look into other work done by the researchers to see how it has been received by other experts in the field. For example, if a study about vaccine safety seems to be coming from way out in left field, and you see that the author is (for example) someone who believes that vaccines cause autism and that autism can be treated through chemical castration, you’re pretty safe being skeptical of the research.
  • Watch for ideological red flags. If someone brings up an irrelevant factoid in a science article that seems like it’s designed to make you respond one way or the other to the content, they may be trying to spin you. An article about global warming that talks about Al Gore not being totally green-friendly, for instance, is probably not the most reliable source. This isn’t necessarily a disqualifier, because someone can still relate good science in the midst of an ideological screed, but it’s best to ignore the distractions and just stick to the meat of the subject.
  • Keep an eye out for false balance. If a subject is new or controversial (politically, socially, or scientifically), writers might seek out an opposing viewpoint just for the sake of appearing unbiased – even if that opposing viewpoint is on the extreme fringe. I’m reminded of articles about new fossil discoveries that explicitly sought out commentary from Ken Ham. If you see this kind of thing happening, there’s a decent chance that the author isn’t familiar with the mainstream viewpoint or doesn’t really care about representing a view in proportion to how widely accepted it is.
  • Look at the ads. (Unless you’ve got them blocked, of course.) Google ads can be especially useful, since they tend to be tailored to the content of the site. If you’re on a site you’re not familiar with, the ads might give you a hint about what sort of things are usually posted there. If you see things about fad diets, alternative medicine, esoteric and fringe science concepts, and so on, you may want to be extra-skeptical!

Now… go forth and read critically!

Irreverent Skeptics Podcast Episode 15 – Who are we? And how did we get here?

On March 22, 2014, Jon, the two Mikes, and Erno had a chat about their backgrounds with regard to religion, science, and skepticism, and how the podcast itself came to be.

Special guest Adam Reakes, host of the Herd Mentality Podcast, joined Jon, Brandi, and Erno to talk about his show and share some good news about how the Australian Vaccination Network, a group that spreads misinformation and helps spark outbreaks of preventable diseases in Australia, faced some serious legal challenges. We also discussed how insurance covers woo-woo treatments such as chiropractic and homeopathy, and how this gives them an unearned air of legitimacy. Adam also had exciting news about a project he’s working on that every exorcist is sure to love!

The intro music for the interview is “Go” by Guineo (David Ramírez and Avelino Herrera). This work is licensed under a Creative Commons Attribution-NonCommercial 2.5 Generic License. Find the original at http://guineo.atlantes.org/guineo%20-%20go.mp3.
