CHICAGO - Children on medicine for attention deficit disorder scored higher on academic tests than their unmedicated peers in the first large, long-term study suggesting this kind of benefit from the widely used drugs.
Wow. WOW. I get more actionable information from porn.
1. The comparison isn't between medicated kids and "unmedicated peers"; it's between kids with ADHD who get meds and kids with ADHD who don't.
1b. "Both groups had lower scores on average than a separate group of children without ADHD."
2. The study indicates that by 5th grade, the medicated kids derived a benefit on test scores equivalent to 1/5 of an academic year. That would be about two months. (Still below non-ADHD kids, though.)
2b. In order to derive this benefit, kids needed to be on the medications for about 3 years consecutively; in other words, they had to "learn" while on meds. Risk-reward?
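The "two months" arithmetic is simple enough to show. A minimal sketch, assuming a 10-month school year (my assumption, not a number from the study):

```python
# Rough arithmetic behind the "two months" figure.
# Assumptions (mine, for illustration): a school year is ~10 months,
# and the reported gain by 5th grade is 1/5 of an academic year.
SCHOOL_YEAR_MONTHS = 10   # assumed length of one academic year
GAIN_FRACTION = 1 / 5     # reported cumulative benefit by 5th grade

gain_months = GAIN_FRACTION * SCHOOL_YEAR_MONTHS
print(gain_months)        # ~2 months, accumulated over roughly five grades on meds
```

Two months of progress, bought with about three consecutive years of medication.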
Our objective was to determine if reported medication use for attention-deficit/hyperactivity disorder is positively associated with academic achievement during elementary school. CONCLUSIONS. The finding of a positive association between medication use...and test scores is important, given the high prevalence of attention-deficit/hyperactivity disorder and its association with low academic achievement.
You say, "the study did find that stimulants were effective. Wasn't that the whole point?"
So that's the kind of study analysis they talk about in medical school but don't bother to teach. See how awesome it is to look critically at a study's methodology, to differentiate clinical significance from statistical significance? (Never mind that the study produced nothing new.)
This kind of analysis is the intellectual equivalent of turning a gun sideways. Looks cool to anyone who's never actually held a gun, but dangerously unreliable when it matters most.
The question: what do the authors want to be true?
First of all, was this study really necessary, let alone important enough to end up in Pediatrics?
There are already plenty of studies examining, specifically, stimulants and school performance. Here are seven: 1, 2, 3, 4, 5, 6, 7.
I'll admit that this study is unique in that it is prospective and long, but do we need a unique, prospective, long study of what we already know? It isn't even important research: it has been pretty much established that there aren't significant effects on overall academic performance in ADHD kids. So why bother doing this study?
Or, you might ask me: "why does this study, in particular, bother you?"
The authors' names aren't important here; their degrees are. Six authors-- only one an MD. The rest are PhDs.
Do you think PhDs care about ADHD drugs? The study isn't about the efficacy of medications; it's about the validity of ADHD. "See? We're studying a medical problem. Can we get some grant money now?"
Don't send me back to my pirate ship yet. The authors are from the Petris Center, which receives funding to examine healthcare policy. They got $900,000 from NIMH to study this. Was it worth it? It doesn't matter: if there's a million dollars out there to study something that could have been settled with a review paper (or a blog post), then you're going to do it.
This is the basic problem with academic research. Covering the same old ground, over and over, focusing on whatever is institutionally (or politically) popular.
Given this kind of research, I have no expectation that any progress will be made in the "treatment of ADHD," let alone in improving anyone's academic performance. I am entirely confident, however, that this lack of progress will cost millions and millions of dollars.