09 August 2006 @ 10:39 am
Common sense isn't enough: why reporters need training in how research works  
I xposted my earlier gripe about bad reporting on research to journalists, where mswalter offered a comment about his own experience covering social research. In essence, he suggests that common sense is usually sufficient to evaluate the claims of the study. He seems like a smart and sensible reporter, and I don't doubt his intelligence or the good faith with which he approaches the task of informing his readers about the latest social research. He sounds like the sort who probably gets it right a lot more often than some I read.

However, as I noted in a reply to his comment, common sense ain't enough. I have pulled my comment out and am posting it here for those who don't track the journalists board.
_______________________
Please don't think I'm being harsh here, and please understand that I'm not picking on you. Even if you're guilty of all kinds of crimes on this issue, you're hardly alone. Your situation is the rule, not the exception.

The fact that reporters do as you did because it seems like common sense is exactly why they need training. That's the problem with research - and social research in particular. What looks like common sense is often wrong. In this study, the kinds of kids who are likely to engage in sexual activity are also probably the kinds of kids who listen to sexually charged music. But the cultural factors that shape who they are up to that point are likely what cause both.

Sure, there's a relationship, and the relationship points to something worthy of analysis. But the world of quantitative social research, where young professors are trying to do whatever they can to get published so they can get tenure, is prone to this type of abuse and has been for decades.

Worst of all, a reporter with no training is a researcher's bitch in any interview about the study. The release is going to make claims that the reporter has no tools to critique or analyze, and when they talk the reporter has no idea when he/she is being misled. I'm not even suggesting that the researcher is lying - usually they believe what they're doing is accurate and justifiable (and sometimes it is).

Let me offer an example. Say that in a period of three months you do stories on 10 social research studies that demonstrate a variety of interesting results. All are similarly structured to this one, and all produce common sense results. Now, having not seen the study, I'm going to make an educated guess here that the main findings were demonstrated with a degree of confidence at the .90 level, which is the standard level of acceptable error in this kind of study. First off, if you have no training, you have no idea what I just said or why it matters.

Now, if this is the case, then what would you say if I told you that - all other complaints about what's wrong with this kind of study notwithstanding - the odds are that at least one of the studies you just reported on is fiction? The results are wrong. They found an effect where, in reality, there was none.

As a reporter, are you comfortable knowing that 10% of the time what you're reporting is erroneous? Because that's about the size of it.
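For the curious, here's a rough back-of-the-envelope sketch of that arithmetic in Python. The .90 confidence level is the educated guess from above, and the assumption that the ten studies are independent (and, worst case, that none of the reported effects is real) is purely for illustration:

    # Ten studies, each tested at a .90 confidence level, i.e. a 10% chance
    # of reporting an effect that isn't actually there. The .90 figure and
    # the independence of the studies are assumptions for illustration only.
    alpha = 0.10      # per-study false-positive rate at the .90 level
    n_studies = 10    # studies reported on over the three months

    # Expected number of bogus findings if every "effect" were really noise
    expected_false_positives = n_studies * alpha            # 1.0

    # Probability that at least one of the ten findings is a false positive
    p_at_least_one = 1 - (1 - alpha) ** n_studies           # about 0.65

    print(f"Expected false positives: {expected_false_positives:.1f}")
    print(f"P(at least one bogus finding): {p_at_least_one:.2f}")

Run it and you get an expected one bad study out of the ten, and roughly a 65% chance that at least one of your write-ups described an effect that doesn't exist.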

I say this as a guy with a PhD in Mass Communication who had to study this stuff way more than I wanted to, and also as a guy who has published both quantitative and qualitative social analyses. And who has been a J professor. And who is currently a consultant dealing, on occasion, with professional applications of both quant and qual research. There are plenty out there who'd disagree with me - you can find them in any "social science" department in America where the ideology of quant research holds sway.

Your readers would be better served - regardless of what the study was - if you'd been subjected to the kind of old-school training I had as an undergrad (although that training came from my psych major, and I was being trained to DO research, not to analyze it as a journalist - not sure what the prospective reporters at Wake Forest were being required to take back then). It may have turned out that there WAS some validity to the study, but we have no way of knowing that if the reporters covering it aren't trained to analyze it.

 
 
Now Playing: "This Time" by INXS
 
 
 
thesandsabrase on August 9th, 2006 03:57 pm (UTC)
I have noticed that both reporters and the social researchers they cover miss one crucial fact: correlation is not causation.
(Anonymous) on August 9th, 2006 04:21 pm (UTC)
Actually, they don't even know the difference between "correlation" and "relationship" - and that a relationship, likewise, is not causation.

JSO
thesandsabrase on August 9th, 2006 04:25 pm (UTC)
Yes, that's a valid point.
_candide_ on August 10th, 2006 12:14 am (UTC)
Considering that people ignore the data from something like physics, are you at all surprised that they totally screw up one of the "soft" sciences?

I've found, in my vain, delusional attempts at science education, that people only hear the part of the explanation that supports their beliefs, and ignore the rest. Reason is dead. Data is meaningless. Belief trumps all.
Sam (lullabypit) on August 10th, 2006 12:16 am (UTC)
Considering that people ignore the data from something like physics, are you at all surprised that they totally screw up one of the "soft" sciences?

I apologize profusely if I inadvertently communicated "surprise."
_candide_ on August 16th, 2006 01:29 am (UTC)
:snerk: