among patients served by Kaiser Permanente who were seeking diagnosis.
This article by Stephanie O’Neill, via Southern California Public Radio, shows a common problem with science and medical reporting. At least it includes a link to the full text of the study, so you can see the methodology and results instead of just the abstract.
You don’t really need to read the whole thing to see the giant flaw, though. It’s right here in the second paragraph of the article:
The study of nearly 850,000 patients ages five to 11, who were seen at Kaiser’s Southern California hospitals, found a 24 percent jump in the number of children diagnosed with ADHD from 2001 to 2010.
Not random, not controlled, not blinded in any way at all. Of course rates are going to look like they’ve jumped dramatically if your study population includes only people who came in requesting diagnoses in the first place. There’s some decent general information in the study, and some population concentrations that might be worth examining in a better-designed piece of research, but none of that is showing up in the headlines.
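To see why that selection matters, here’s a minimal sketch with made-up numbers (the prevalence, population size, and care-seeking rates below are all hypothetical, not the study’s data). Hold the actual rate of ADHD perfectly constant and only increase how many families bring a kid in for an evaluation, and the number of diagnoses “jumps” all by itself:

```python
import random

random.seed(42)

TRUE_PREVALENCE = 0.05   # hypothetical: held constant across both "years"
POPULATION = 100_000     # hypothetical community size

def diagnoses(seek_rate):
    """Count diagnoses in one year: only kids brought in for an evaluation can be diagnosed."""
    count = 0
    for _ in range(POPULATION):
        has_adhd = random.random() < TRUE_PREVALENCE
        seeks_evaluation = random.random() < seek_rate
        if has_adhd and seeks_evaluation:
            count += 1
    return count

# Care-seeking rises from 40% to 50% of families; prevalence never moves.
print(diagnoses(0.40))  # ~2,000 diagnoses
print(diagnoses(0.50))  # ~2,500 diagnoses: a ~25% "jump" with zero change in prevalence
```

Same kids, same underlying rate; the only thing that changed was who showed up to be counted.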
No, everyone in the media and on the internet is clutching their pearls about this dramatic increase. Cue the storm of conspiracy theorists insisting that this is proof that ADHD is an imaginary problem created by Big Pharma. Watch the comments of people saying it’s reflective of our horrible society that doesn’t discipline children properly and/or lets them watch too much TV and play too many video games. Note that none of these reports mentions the obvious flaw in the methods, or musters the mathematical acumen to figure out that this “dramatic jump” actually means one or two more children out of every hundred.
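For anyone who wants that last bit of arithmetic spelled out: a relative increase has to be multiplied by the baseline rate to get the absolute change, and the baseline here is small. A quick sketch, assuming a hypothetical baseline of 5 diagnosed children per 100 (the study’s real figures are in the linked full text):

```python
# Hypothetical baseline rate; check the linked study for the actual figures.
baseline_rate = 5.0      # diagnosed children per 100 before the increase
relative_jump = 0.24     # the reported "24 percent jump"

new_rate = baseline_rate * (1 + relative_jump)
print(f"{baseline_rate:.1f} per 100 -> {new_rate:.1f} per 100")
print(f"Absolute change: {new_rate - baseline_rate:.1f} more children per 100")
# 5.0 per 100 -> 6.2 per 100
# Absolute change: 1.2 more children per 100
```

A headline built on the 24 percent figure sounds very different from one built on 1.2 more kids per hundred.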
*sigh*