However, as I noted in a reply to his comment, common sense ain't enough. I have pulled my comment out and am posting it here for those who don't track the journalists board.
Please don't think I'm being harsh here, and please understand that I'm not picking on you. Even if you're guilty of all kinds of crimes on this issue, you're hardly alone. Your situation is the rule, not the exception.
The fact that reporters do as you did because it seems like common sense is why they need training. That's the problem with research - and social research in particular. What looks like common sense is often wrong. In this study, the kinds of kids who are likely to engage in sexual activity are also probably the kinds of kids who listen to sexually charged music. But the cultural factors that shaped who they are up to that point likely cause both - correlation, not causation.
Sure, there's a relationship, and the relationship points to something worthy of analysis. But the world of quantitative social research, where young professors are trying to do whatever they can to get published so they can get tenure, is prone to this type of abuse and has been for decades.
Worst of all, a reporter with no training is a researcher's bitch in any interview about the study. The release is going to make claims that the reporter has no tools to critique or analyze, and when they talk the reporter has no idea when he/she is being misled. I'm not even suggesting that the researcher is lying - usually they believe what they're doing is accurate and justifiable (and sometimes it is).
Let me offer an example. Say that in a period of three months you do stories on 10 social research studies that demonstrate a variety of interesting results. All are structured similarly to this one, and all produce common-sense results. Now, having not seen the study, I'm going to make an educated guess that the main findings were demonstrated at the .90 confidence level - that is, with a 10% chance of error accepted as the standard in this kind of study. First off, if you have no training, you have no idea what I just said or why it matters.
Now, if that's the case, what would you say if I told you that - all other complaints about what's wrong with this kind of study notwithstanding - the odds are that at least one of the studies you just reported on is fiction? The results are wrong. They found an effect where, in reality, there was none.
As a reporter, are you comfortable knowing that 10% of the time what you're reporting is erroneous? Because that's about the size of it.
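To make the arithmetic concrete - a quick back-of-the-envelope sketch, with the numbers assumed rather than taken from any actual study - if each of 10 studies independently carries a 10% chance of a spurious finding, then it's more likely than not that at least one of them is wrong, and on average one of the ten will be:

```python
# Illustrative only: assumes each study is reported at the .90 confidence
# level (a 10% chance its finding is spurious) and that the 10 studies
# are independent of one another.
p_error = 0.10
n_studies = 10

# Probability that at least one of the 10 reported findings is wrong:
# 1 minus the probability that all 10 are correct.
p_at_least_one_wrong = 1 - (1 - p_error) ** n_studies
print(round(p_at_least_one_wrong, 3))  # 0.651

# Expected number of erroneous studies among the 10.
print(p_error * n_studies)  # 1.0
```

So under these assumptions there's roughly a 65% chance the reporter has passed along at least one fictional result in those three months.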
I say this as a guy with a PhD in Mass Communication who had to study this stuff way more than I wanted to, as a guy who has published both quantitative and qualitative social analyses, and as a former J-school professor who now consults, on occasion, on professional applications of both quant and qual research. There are plenty out there who'd disagree with me - you can find them in any "social science" department in America where the ideology of quant research holds sway.
Your readers would be better served - regardless of what the study found - if you'd been subjected to the kind of old-school training I had as an undergrad (although that training came with my psych major, and I was being trained to DO research, not to analyze it as a journalist - I'm not sure what the prospective reporters at Wake Forest were required to take back then). It may have turned out that there WAS some validity to the study, but we have no way of knowing that if reporters aren't trained to analyze such studies.