Pesticide exposure during pregnancy increases autism risk in the child? Not so fast…

If there’s one thing that antivaccine activists share, it’s the passionate (and as yet unproven) belief that “something” out there in the environment caused the “autism epidemic.” Usually, that “something” was thought to be vaccines, but with the utter failure of the vaccine-autism hypothesis to the point where it is considered soundly refuted, antivaccinationists have gotten a bit more—shall we say?—creative. Now it’s something in the environment. Sometimes it’s mercury, despite the utter lack of evidence that mercury in vaccines is even remotely linked to autism. Sometimes it’s mercury from coal-burning power plants. Sometimes it’s pollution, even from China. Other times it’s pollution from freeways. Other times, it’s pesticides, acetaminophen, and antidepressants. Needless to say—but I’m going to say it anyway—the enthusiasm of autism cranks for these various causation hypotheses vastly outweighs any evidence in support of them, which is usually either really weak or nonexistent.

Well, here we go again.

I tried. I really did. I tried to let this cup pass, but it kept popping up on my Facebook feed. I kept seeing it on Twitter. Finally, I just had to say, WTF? If I can’t beat ’em, I’ll beat ’em over the head with science. I’m referring, of course, to the latest study making the rounds claiming to find a link between autism and…something, anything! This time around, it’s pesticides, and here’s the CNN report on a study finding a link between pesticides and autism:

Scientists have long hypothesized that chemicals found in our environment play a role in causing autism. Research published this week in Environmental Health Perspectives supports that theory, finding children whose mothers are exposed to agricultural pesticides during pregnancy may be at increased risk for autism spectrum disorders, or ASD.

Researchers at the University of California, Davis, looked at the medical records of 970 participants. They found pregnant women who lived within a mile of an area treated with three different types of pesticides were at a two-thirds higher risk of having a child with ASD or developmental delays. These pesticide-treated areas included parks, golf courses, pastures and roadsides.

Before I get to the study itself, I couldn’t resist clicking on the link in the CNN story, which leads to another CNN story from 2011, Scientists warn of chemical-autism link. It’s basically a story that looks at all sorts of dubious studies linking the dreaded “chemicals” to autism. Very unimpressive. But what about this study? It comes from UC Davis and Irva Hertz-Picciotto’s group at the Medical Investigation of Neurodevelopmental Disorders (MIND) Institute and appears as an advance publication in Environmental Health Perspectives. Sadly, it’s as thin a gruel as can be, a study that shows, basically, not much at all, and that’s being generous. At best, it might serve as a hypothesis-generating study, because it sure doesn’t nail down any risk between pesticide exposure and autism. Or not even that. In any case, I’ve encountered Irva Hertz-Picciotto before over at my not-so-super-secret other blog, although I wasn’t the one who wrote about her. Steve Novella was, noting that she said in a press release about an earlier study:

It’s time to start looking for the environmental culprits responsible for the remarkable increase in the rate of autism in California.

So right away, you know there’s a bias. Of course, every scientist has biases; that doesn’t necessarily mean she’s wrong about this study. Actually, there are lots of other reasons to think she’s wrong, or at least that, as I like to say when reviewing a scientific paper, the data do not support the conclusions made.

Before you can understand this paper, you need to know what the CHARGE study is, because that’s the study from which these data were reported. The CHARGE study is a case-control study of autism spectrum disorder (ASD), developmental delay (DD), and typical development consisting of 970 subjects in California. A case-control study, as you recall, is an observational study in which two existing groups are compared on the basis of an attribute thought to be causal for the outcome of interest. Such studies are often used to identify factors that might contribute to a medical condition by comparing people who have that condition (the cases) with a matched group of people who do not (the controls). For instance, it was a case-control study by Sir Richard Doll, published in 1950, that was among the first to find strong evidence that smoking was associated with lung cancer; it compared subjects with lung cancer to those without and found that many more of the subjects with lung cancer smoked. The groups are supposed to be chosen so that they are as similar as possible except for the condition being studied.

In any case, the control group should come from the same population as the cases, and the numbers of cases and controls don’t have to be equal. In this particular case-control study of ASD and DD, there were three groups: 486 subjects with ASD, 168 with DD, and 316 with typical development. So overall, the numbers weren’t really that large. All of the subjects in this particular study lived in California, within the catchment area for the MIND Institute, defined as within a two-hour drive of Sacramento.

The investigators then took advantage of a California law that mandates that commercial applications of agricultural pesticides be reported to the California Department of Pesticide Regulation (CDPR), which makes the data publicly available in the form of the annual Pesticide Use Report (PUR). As the authors describe:

As described by CDPR, the pesticide use report data includes “…pesticide applications to parks, golf courses, cemeteries, rangeland, pastures, and along roadside and railroad rights-of-way. In addition, all postharvest pesticide treatments of agricultural commodities must be reported along with all pesticide treatments in poultry and fish production as well as some livestock applications. The primary exceptions to the reporting requirements are home-and-garden use and most industrial and institutional uses.”(California Department of Pesticide Regulation 2014)

Basically, the authors examined how close the mothers of the CHARGE subjects lived to areas treated with pesticides during pregnancy and, using a complicated statistical model based on the distances between subjects’ homes and sites of pesticide application, tried to find correlations between “exposure” to pesticides by the mother during pregnancy and ASD or DD. The scare quotes are intentional, because using that word here is dubious in the extreme. In fact, I could go into the detailed methodology more, delving into some of the analyses carried out. I don’t understand them all (not being a statistician), but if there’s one thing I’ve learned reviewing grants, it’s that I don’t necessarily have to understand the detailed statistics when I understand experimental design, and this experimental design does not produce data that justify the conclusions of the paper. I could talk about how the authors looked at all sorts of different pesticides of different chemical classes and toxicities. I could, but I won’t—much. The reason is that this study is so fatally flawed from its very inception that there’s really very little point. If the overall design of the study is rotten, then all the details in the world won’t fix that.
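To make concrete just how thin that operational definition is, here’s a minimal sketch in Python of the basic idea (my own illustration, not the authors’ code; the study’s actual model uses daily address assignments, several buffer distances, and trimester-specific time windows, and the coordinates below are made up): a mother gets flagged as “exposed” if her geocoded residence falls within a chosen buffer distance of any reported application. No dose, no air sampling, just proximity.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two latitude/longitude points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def proximity_exposed(home, application_sites, buffer_km=1.25):
    """Flag a residence as 'exposed' if ANY reported pesticide application
    falls within buffer_km of it -- proximity only, no actual dose measured."""
    return any(
        haversine_km(home[0], home[1], site[0], site[1]) <= buffer_km
        for site in application_sites
    )

# Hypothetical coordinates, purely for illustration (not taken from the PUR data).
home = (38.55, -121.74)
application_sites = [(38.558, -121.745), (38.70, -121.90)]
print(proximity_exposed(home, application_sites, buffer_km=1.25))
# Prints True: the nearer hypothetical site is roughly 1 km from the home.
```

Every association the paper reports ultimately rests on a yes/no flag like that one.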

Here’s what I mean. What are the hypothesis and the underlying assumption of this study? The hypothesis being tested is simple: pesticide exposure during pregnancy is a risk factor for the child developing ASD or DD. However simple the hypothesis is, the problem is the definition of “exposure.” Clearly, the underlying assumption behind the study is that if a mother lived within a certain distance of pesticide applications during her pregnancy, that constitutes “exposure” to pesticides. So, when the authors conclude that children with ASD were 60% more likely to have had mothers who lived during their pregnancy within 1.25 km of pesticide application (and were therefore “exposed”) compared to typical children (adjusted odds ratio [OR] = 1.60, 95% CI 1.02-2.51) and that mothers of children with DD were 150% more likely to have been “exposed” (adjusted OR = 2.48, 95% CI 1.04-5.91), there’s a huge assumption being made that isn’t justified anywhere in the text with actual…oh, you know…evidence. Also notice the confidence intervals. These are the best confidence intervals the authors could find looking at different types of pesticides and different “buffer zone” distances, and they still come very close to failing to be statistically significant (notice how the 95% confidence intervals almost include 1.0). I’m also pretty darned close to certain that the differences found are not clinically significant. (The two are not the same thing.)
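For readers who don’t work with odds ratios every day, here’s a back-of-the-envelope sketch of where numbers like these come from and why a lower confidence bound of 1.02 is worth squinting at. The counts below are hypothetical, chosen only to produce a similarly borderline result; they are not the study’s actual cell counts, and the study’s ORs were additionally adjusted for covariates, which a crude 2×2 table can’t reproduce.

```python
from math import exp, log, sqrt

def crude_or_with_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # standard error of log(OR)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# Hypothetical counts, NOT the study's data:
# 80 of 280 cases "exposed" vs. 40 of 200 controls "exposed".
print(crude_or_with_ci(a=80, b=200, c=40, d=160))
# -> roughly (1.60, 1.04, 2.47): the interval only just excludes 1.0 (no effect).
```

The point isn’t the exact numbers; it’s that when the best interval in the whole table only just clears 1.0, the finding is fragile.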

The sine qua non of a good study demonstrating an association between an environmental exposure and a condition is actual verification and quantification of the exposure under study in the cases. Sometimes this involves measuring the levels of the chemicals in question, either in the research subjects themselves (ideally) to document exposure, or in the places where they live and work. For instance, for a smoker, exposure can be estimated in a number of ways, from surveys to measuring cotinine levels in the urine or blood to verify exposure to tobacco smoke. Just living close to places where pesticides were applied does not mean there was an exposure. It just doesn’t. Pesticide use in California is heavily regulated (witness the reports to the state, made publicly available, that allowed the investigators to do this study in the first place). Chances are that the amount of pesticide that made it into the air was negligible. We don’t know for sure, but there’s little reason to suspect that there were significant pesticide exposures over such distances from standard applications. I know that saying so will probably cause antivaccine activists to try to call me a shill for Big Chemical, to go along with my supposedly being a Pharma Shill, but such is life.

Table 3 is particularly revealing (and you can see it for yourself): the authors looked at different chemical classes, different buffer distances, and different time windows of exposure (pre-conception, first trimester, second trimester, and third trimester). There seems to be no consistent pattern, no rhyme or reason. For instance, for some of the time windows, the adjusted OR increases with increasing distance! The distance-effect relationships are not consistent. The numbers seem a bit more consistent for DD, but the numbers of subjects are so small that few of the relationships reach statistical significance, and given the number of comparisons, the few that do look consistent with random variation; i.e., “noise.”
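That “noise” interpretation is easy to demonstrate. Here’s a quick simulation, under simplifying assumptions (independent comparisons, a made-up 30% background “exposure” rate, and group sizes borrowed from the DD and typical groups), of what happens when you scan a few dozen subgroup combinations (chemical class by buffer distance by time window) even when exposure has no effect at all.

```python
import random
from math import erf, log, sqrt

random.seed(0)

def wald_p_value(a, b, c, d):
    """Two-sided p-value for the null hypothesis OR = 1, via a Wald z-test on log(OR)."""
    if min(a, b, c, d) == 0:
        return 1.0
    z = abs(log((a * d) / (b * c))) / sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))  # 2 * (1 - Phi(z))

def null_comparison(n_cases=168, n_controls=316, p_exposed=0.30):
    """One case-control comparison in which 'exposure' is pure noise:
    cases and controls are exposed with exactly the same probability."""
    a = sum(random.random() < p_exposed for _ in range(n_cases))      # exposed cases
    c = sum(random.random() < p_exposed for _ in range(n_controls))   # exposed controls
    return wald_p_value(a, n_cases - a, c, n_controls - c)

# Scan many subgroup combinations (chemical classes x buffers x time windows), all null.
n_comparisons = 4 * 3 * 4
hits = sum(null_comparison() < 0.05 for _ in range(n_comparisons))
print(f"{hits} of {n_comparisons} comparisons 'significant' at p < 0.05 by chance alone")
```

Run it a few times with different seeds and you’ll typically get a couple of spurious “hits” per run, which is more or less what a table full of borderline, inconsistent results looks like.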

The discussion section of the paper spends a lot of time looking at in vitro and animal studies in a way that distracts from the utterly unconvincing nature of the research findings just reported. Perhaps the biggest howler is this paragraph:

Several limitations to this study were unavoidable in the exposure assessment, potentially producing misclassification. Primarily, our exposure estimation approach does not encompass all potential sources of exposure to each of these compounds: among them external non-agricultural sources (e.g. institutional use, such as around schools); residential indoor use; professional pesticide application in or around the home for gardening, landscaping or other pest control; as well as dietary sources (Morgan 2012). Other sources of potential error include errors in reporting to the Pesticide Use Report data base, the assumption of homogeneity of exposure within each buffer, and potential geo-coding errors. Seasonal variation and address changes mid-pregnancy were accounted for by assigning an address to each day instead of one address for the individual, but information on hours spent in the home or elsewhere was not available.

Funny how the authors neglected the biggest limitation of them all: that they haven’t demonstrated the validity of their core assumption, namely that living within 1.5 km of a site of pesticide application means there was significant pesticide exposure. True, the authors characterize their study as “exploratory” and are cautious with their language in the paper (although, I would argue, not cautious enough), but elsewhere they let their true freak flags fly. For instance, in SFGate:

“This study validates the results of earlier research that has reported associations between having a child with autism and prenatal exposure to agricultural chemicals in California,” said lead study author Janie F. Shelton, a UC Davis graduate student who now consults with the United Nations. “While we still must investigate whether certain sub-groups are more vulnerable to exposures to these compounds than others, the message is very clear: Women who are pregnant should take special care to avoid contact with agricultural chemicals whenever possible.”

I would agree that pregnant women shouldn’t be handling industrial-strength pesticides, but if there’s evidence that living within a mile or so of areas where pesticides are used during pregnancy will cause a woman’s child to develop autism, certainly neither Shelton nor Hertz-Picciotto has provided it, either in this paper or elsewhere. Sadly, that didn’t stop the press from dutifully responding to the press release from the MIND Institute as though this study were slam-dunk evidence that pesticide exposure during pregnancy causes autism. It’s not, not by a long shot. It barely qualifies as maybe hypothesis-generating evidence. Wait. Strike that. I don’t think it even qualifies as that.