The deception of “Studies say”

In my recent talk on Calling BS on Social Media Gurus, I pointed out how studies are frequently misrepresented by experts, journalists, and even the researchers themselves.

My point is not that you should never believe anyone. It’s just that you should ask some basic questions before you do.

Here’s a recent example:

The Washington Post reported on a study about college students and empathy, published last week by the University of Michigan. The article is titled “College students losing their sensitive side.”

Their second paragraph makes this claim:

“A new University of Michigan study has found that since 2000, college students have become less empathetic.”

The problem is this isn’t quite what the research says.

And the Washington Post article is likely based on the press release from the University of Michigan (not an uncommon practice, but a surprise to those who don’t know about it).

The actual research is just a survey, much like the ones you avoid filling out if you can. The survey asks a bunch of questions about students’ own perceptions of their empathy. This is not the same as their actual empathy, which is hard to measure – unless during the study you can throw some wounded puppies their way and see how the students behave, an experiment that’s hard to do with 14,000 students.

Some headline stories, such as this one from The Globe and Mail, lead with the claim that today’s students are 40% less empathetic than previous generations. What does that even mean? That they’d let 40% more puppies die? No. It means that on a survey, 40% of them answered one radio button to the left instead of to the right, or something similarly abstract and detached from any specific situation where empathy might apply.
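
To make the abstraction concrete, here’s a back-of-the-envelope sketch in Python – with numbers invented purely for illustration, not taken from the study – of how a “40% lower” headline can fall out of a one-box shift in average answers on a 1-to-5 survey scale:

    # Invented numbers, purely for illustration – not from the actual study.
    # Shows how a "40% lower" headline can come from mean survey scores.
    old_mean = 2.5  # hypothetical average empathy score (1-5 scale), decades ago
    new_mean = 1.5  # hypothetical average score today: one box to the left

    drop = (old_mean - new_mean) / old_mean
    print(f"Headline number: {drop:.0%} lower")                       # 40% lower
    print(f"Actual shift: {old_mean - new_mean:.0f} box per answer")  # 1 box

The same shift sounds dramatic or mundane depending on which number makes the headline.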

So what the study actually says is this: students’ perceptions of their own empathy, as framed by a survey, are lower than those of previous generations.

Measuring someone’s perception is not the same as measuring their behavior. You might think you’re cool, but really you’re a jackass. Or vice-versa. But this article suggests your perception of yourself and reality are the same thing (there is one referenced study that claims a correlation, but it’s behind a paywall, which I perceive as annoying). It is entirely possible there are other reasons for the change in the data. Perhaps students in 2010 are more self-aware and honest about their selfishness than previous generations were? Seems possible. It’s certainly a question worth asking.

Konrath, one of the researchers, is quoted as saying: “We found the biggest drop in empathy after the year 2000… college kids today are about 40 percent lower in empathy than their counterparts of 20 or 30 years ago, as measured by standard tests of this personality trait.”

The key phrase is “standard tests of this personality trait”. By standard, she probably means these surveys, and all surveys have known problems, biases and limitations (the specific personality test is apparently the Davis Interpersonal Reactivity Index). Standard tests are limited – very limited. They might be the best tools we have, but when a limited tool is used to make a general claim, the claim is based less on science than on opinion, especially when a scientist is asked to explain “why”. Studies like this are rarely designed to explain why, but that doesn’t stop many of these articles and experts from theorizing anyway.

And to be completely fair – even if there is good reason to make these claims, it doesn’t seem the journalists and reporters writing about them have done much work to verify that there is. Also, everyone is entitled to an opinion. But a scientist’s opinion, when quoted, is typically framed as science rather than as opinion.

If nothing else, take the actual survey yourself – you’ll see what the students in the survey saw, and when you’re done you’ll be scored against the actual data. Cool. And I suspect you’ll feel more aware of what studies and claims might really mean.

For reference, the same basic story that appeared in the Washington Post also appeared on:

  • MSNBC
  • US News (they did better: ‘Today’s College Students More Likely to Lack Empathy’)
  • Psychology Today
  • LA Times (actually talks about the study)
  • SFGate – this opinion piece suggests our critical view of youth is perennial

But none provide a link to the actual paper presented last week. I dug around for a half-hour and found a PDF of the summary on Sara Konrath’s site. I’d really love to see a rule that any article referring to a study or research must include a link to the actual research. It’s rarer than you’d think.

8 Responses to “The deception of “Studies say””

  1. Franke

    Couldn’t agree with you more, Scott. When I was a newspaper reporter, I always scrutinized studies first with three questions: 1) Who paid for the study, 2) What was the researchers’ methodology, and 3) How many actual participants were included in the study. And that was just the first round of questions. (Admittedly, this biased me into thinking that, generally, research did not necessarily say what the studies claimed.)

    The use of the word “study” here is problematic, since it’s actually based on a survey, as you point out. “Study” implies some actual digging, whereas a survey does not.

  2. Betawriter

    I fully agree with you too, Scott. Between the facts and what’s published in the mass media there is a long road: the researcher’s point of view, the press release from the organization, and the interpretation of the reporter.

    More often than not, “research” is pretty bad. If it can’t be measured and reproduced by third parties, it’s certainly not Science, but a lot of things get published because research and publishing are yet another business. Press releases generally exaggerate the study to publicize the institution. And the mass media… well… the name of “Science” (a new kind of religion) sells news very well.

    You are very right: I’ve NEVER seen a link, or even a mention of the complete title of the research article, when reading the typical “a study says…”.

    Great post, thanks.

  3. Scott Berkun

    Franke: I find it disappointing that those basic questions aren’t generally asked, much less answered.

    And the comment threads on web-published reports like these generally don’t ask either.

  4. Truls

    Thank you for pointing out one of my pet peeves. There are so many articles referencing studies without any links to the actual study. And like Franke points out, we need to know the facts about the study to be able to judge the validity of it.

  5. Joe Scarpati

    If a link to the actual research were included in articles, what percentage of readers would click on it and actually read it? (We could probably survey people to find out.) I don’t mean that as a rhetorical question, though. I’m curious whether most people would be interested in finding out the facts, or would just take the headline of the story and regurgitate it to their friends.

    As a consultant who does a lot of research, I couldn’t agree with you more. People are constantly misled by writers/researchers looking to make an interesting or sensational claim with little regard for methodology issues, sample bias, or any other problems that arise from studies.

  6. Scott Berkun

    Joe:

    You’re right, most would not. But some will. And some of them will ask questions and make comments that add value.

    Perhaps more deceptive is to report on a study and not mention to readers that the reporter hasn’t read the study. Or not inform the readers that the study isn’t even available – and that all that was reported on was a press release and comments by those who made the study. This would take 30 seconds to do – one or two sentences – and would appropriately frame the entire story.

    “Old media” was just as bad perhaps, but anyone who claims new media and blogs are revolutionary in information quality has a lot of explaining to do.

  7. Dave Cavanaugh

    Scott,

    You cobbled together a nice piece that succinctly sums up a process sorely lacking in due diligence – a responsibility to accuracy. You’d swear that these so-called professionals are more interested in crafting snappy link-bait than credible journalistic pieces.

    Though I should register surprise, I merely shrug my shoulders in acquiescence and move on. I clearly recall the headline, but never moved beyond the title – I suspected the piece was rubbish, and your deft analysis confirmed my suspicions.

    I spend a significant chunk of time reviewing studies and their subsequent interpretation in the “press”, and I’m always astounded at how opposing camps cherry-pick morsels or blatantly misrepresent data to support their version of the “truth”. One of my favorite ploys is when an “expert” presents a correlation (and a dubious one at that) as a cause and gets away with it – regularly. It is fun, however, to remind the scoundrel that cause and effect must meet three requirements: association, temporal antecedence, and isolation. They stumble and bumble trying to find a quiet place to hide, but Hume is a fine way to smoke the buggers out.

    Cheers.

