Cashing in on fears of autism

Autism has been the subject of much controversy in recent years.  Rates of diagnosis are rising, and many parents-to-be are understandably worried.  Despite the efforts of many good scientists, we still don't have an explanation for most cases, although we do know that vaccines have nothing to do with it. Many unscientific claims are out there, some of them designed to take advantage of vulnerable parents who are desperate for answers. In this climate, any new claim to have found "the" cause of autism ought to be stated very, very carefully, and backed up by solid evidence.

Just recently, scientists at the UC Davis MIND Institute published a study describing 6 antibodies that were found in the blood of some mothers of autistic children, and found much less often in mothers of normally developing children.  They suggested that these antibodies somehow got into the brains of developing fetuses, causing autism in the children.  They even gave a name to this form of autism: maternal autoantibody-related, or MAR autism.

If true, this finding suggests that a test for these antibodies might predict whether or not a child will have autism.

The study seems plausible, and it was published in a respectable journal called Translational Psychiatry.  Unfortunately, though, the study and the way it has been promoted are plagued with problems.

Perhaps the biggest red flag is that the two lead authors, Daniel Braunschweig and Judy Van de Water, already hold a patent on the proteins described in their paper, and Van de Water is involved with a company, Pediatric Biosciences (PBI), that is already marketing a test to predict autism based on this study.  Van de Water is the Chief Scientific Advisor for the company, which has licensed her patent for this specific test.  The company website claims
"To date, the test has demonstrated 100% accuracy - meaning if a mother or prospective mother has developed the antibodies, then her child will later be diagnosed with AU [autism] or ASD [autism spectrum disorder]."
Wow. This sounds incredibly accurate. This test is apparently the sole product of PBI, and they trumpeted the new study with a press release claiming that
"PBI is developing a diagnostic test based on these findings that will provide physicians with a reliable set of biomarkers for pre-conception … diagnosis of this Maternal Autoantibody-Related (MAR) form of autism."  
PBI is marketing the test to all "women over 30 who are at least 2 times more likely to give birth to an autistic child" as well as other women.

So, with all this hype, and despite the major conflicts of interest of the lead authors, did the study show what they are claiming?  Is this a test that mothers should take?

In a word, no.

First, the study doesn't even answer the right question.  The study looked at antibodies found in the blood of mothers some years after their pregnancy, not during or before pregnancy. Therefore it doesn't tell us whether a test of a pregnant mother will predict anything at all.  Post hoc analyses like this are notoriously inaccurate: for example, it's easy to find factors that predict every presidential election for the past 50 years, but many of them will perform poorly at predicting future elections.  Similarly, finding 6 proteins that distinguish some mothers of autistic children from mothers of normal children, all measured after the fact, tells us nothing about whether we can predict autism in advance.
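
To see just how easy it is to fool yourself this way, here is a minimal simulation of my own (made-up data, not theirs): generate a pile of purely random "biomarkers," pick the one that best matches the diagnoses you already have in hand, and then try that same marker on fresh data.

```python
# A sketch of post hoc selection bias, using purely random data.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_markers = 246, 100

# Random binary "markers" and a random binary outcome: by construction,
# NO marker has any real relationship to the outcome.
markers = rng.integers(0, 2, size=(n_subjects, n_markers))
outcome = rng.integers(0, 2, size=n_subjects)

# Post hoc, pick the marker that best "predicts" the outcomes we already have.
accuracy = (markers == outcome[:, None]).mean(axis=0)
best = accuracy.argmax()
print(f"best marker, original sample: {accuracy[best]:.2f}")

# Now test that same marker on a fresh sample.
new_markers = rng.integers(0, 2, size=(n_subjects, n_markers))
new_outcome = rng.integers(0, 2, size=n_subjects)
print(f"same marker, new sample: {(new_markers[:, best] == new_outcome).mean():.2f}")
```

The winning marker typically scores close to 60% on the sample used to pick it, then falls right back to coin-flipping on new data. That is exactly the trap an after-the-fact biomarker study has to guard against.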

Second, their striking claim of 100% accuracy is false.  (Their press release claimed only 99%: "A positive result would mean that they have a 99 percent likelihood of having a child with autism if they proceed with a pregnancy.")  But the study provides no such evidence.  What it did show was that 56 mothers (out of 246) had antibodies to one of 12 combinations of 3 antigens that the authors selected.  [Aside: there are many other combinations of antigens that they did not report, and no indication that they properly adjusted their statistics to account for this.] But the study failed to say whether these 56 mothers also had non-autistic children.  If they did, then even the mothers in their own study had a much lower chance than 99% of having an autistic child.  Given that many families with autistic children also have other, non-autistic children, touting an accuracy number like this, without proof, is outrageous.
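
To put some numbers on that aside: taking the study's 6 antibodies as the panel (what exactly was searched is my assumption), here is how fast the candidate combinations pile up, and what a standard multiple-comparisons correction would do to the significance threshold:

```python
# Counting the combinations available to a biomarker search, and the
# Bonferroni-corrected threshold that should accompany it.  The panel
# size of 6 comes from the study; searching all subsets is my assumption.
from math import comb

n_markers = 6
n_combos = sum(comb(n_markers, k) for k in range(2, n_markers + 1))
print(f"combinations of 2 or more markers: {n_combos}")           # 57

alpha = 0.05
print(f"Bonferroni-corrected threshold: {alpha / n_combos:.4f}")  # ~0.0009
```

Reporting 12 of those combinations without saying how many were examined, or correcting for the search, is precisely the kind of thing that makes a "significant" result evaporate on replication.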

The study isn't completely worthless: it points to a possible link between antibodies in the blood of the mother and autism in a child.  If Van de Water were truly interested in the welfare of autistic families, she would be focusing on the obvious next step: test a large, randomly sampled collection of pregnant women for the same antibodies.  Then follow them up, over the next few years, to see if the test can predict the likelihood of having an autistic child.  Meanwhile, no one should be selling a test before studies can show that it has some predictive value.

But Van de Water and her friends at Pediatric Biosciences seem far more interested in making money off the fears of prospective parents.  PBI is planning to charge about $800 for their test, which they'll begin selling next year. Stanford University biostatistician Steven Goodman says that they are "peddling false hope that giving birth to autistic kids can be avoided."

A news article in Science last week raised many serious questions about Van de Water's study and its claims.  Several scientists quoted in that article questioned the premise of the study: that antibodies from the mother somehow get into the brain of the developing fetus. Yale scientist George Anderson said the data are too preliminary and the statistics too weak to support a clinical test (essentially the same argument I'm making here).

And what about that 99% claim?  In their interview with Science, Van de Water and Jan D'Alvise (president of PBI) said it
"was not meant to convey likelihood in the statistical sense, but rather the 99% accuracy with which the study demonstrated specificity of the biomarkers for ASD."  
This is unadulterated poppycock. As of this writing, the company website still claims that the test has "100% accuracy - meaning if a mother or prospective mother has developed the antibodies, then her child will later be diagnosed with AU or ASD [autism]."  That claim couldn't be any more clear - or any more wrong.
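
Even if we took the company's reply at face value, specificity and the likelihood of having an autistic child are very different numbers. A quick Bayes' rule calculation shows why, using the 99% specificity PBI cites, the study's own 56-of-246 detection rate as the sensitivity, and a ballpark autism prevalence of 1.5% (my assumption):

```python
# Specificity is not predictive value: a Bayes' rule sketch.
prevalence  = 0.015     # assumed rate of autism (roughly the CDC ballpark)
sensitivity = 56 / 246  # mothers of autistic children who tested positive
specificity = 0.99      # the figure PBI itself cites

true_pos  = sensitivity * prevalence
false_pos = (1 - specificity) * (1 - prevalence)
ppv = true_pos / (true_pos + false_pos)
print(f"chance a positive test means an autistic child: {ppv:.0%}")  # ~26%
```

With numbers like these, roughly three out of four positive results would be false alarms - a far cry from 99%, and a frightening thing to hand to a prospective parent.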

With all the controversy over the causes of autism, and with the medical community still struggling to correct the tremendous damage caused by Andrew Wakefield's fraudulent study linking vaccines to autism, the last thing we need is an erroneous claim that someone has found the cause of autism.  And it's far too soon to start offering moms a test that tells them they're going to have an autistic child.

Dr. Oz tries to do science again, this time with green coffee bean extract

Dr. Mehmet Oz hosts a popular TV show in which he promotes all sorts of medical treatments, some good and some - well, not so good.  And once in a while, he tries to do a science experiment, as he did in 2011 with a badly flawed experiment on arsenic in apple juice.

Well, Dr. Oz has done it again.  This time he wanted to re-examine a claim that he himself had made on an earlier show about green coffee bean extract.

In April 2012, Oz aired a segment on his TV show called "Green Coffee Bean Extract: The Fat Burner That Works!"  On it he claimed that "this miracle pill can burn fat fast, for anyone who wants to lose weight." Not surprisingly, sales of green coffee bean extract skyrocketed in response.
"A marketing apocalypse was ignited!" Dr. Oz pointed out in his show in September of 2012.  "I was surprised by the firestorm," he said.
Dr. Oz loves this topic, by the way. He's run dozens of shows on weight-loss gimmicks, such as "The New Silver Bullet for Weight Loss" in which he promoted a new diet pill called Qnexa, and "Ancient Ayurvedic Secrets to Lose Weight".  But let's leave those for another day.

One problem with Oz's first green coffee bean show was that he based it on a study that had some serious problems.  That study claimed that a particular brand of green coffee bean extract called GCA led to significant weight loss.  Subjects lost a lot of weight, too: 8 kilograms (over 17 pounds) on average.  Dr. Oz called it "a staggering, newly released study."  Wow, must be good, right?

Let's look at that study, shall we?  First, it only involved 16 people, a tiny sample.  There were 3 treatments: high-dose GCA, low-dose GCA, and placebo.  The subjects were divided into 3 even smaller groups, but not by treatment: instead, each group took all 3 treatments, for 6 weeks at a time, with a 2-week rest period in between.  The only difference between groups was the order of the treatments (high-dose/low-dose/placebo).  Subjects in all 3 groups lost about the same amount of weight.  So what was the difference?  Well, the authors claimed that subjects lost more weight during the periods when they were taking GCA than during the periods when they weren't, even though they lost weight during the placebo periods too.
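
If the design is hard to picture, here is a sketch (the particular treatment orders assigned to the three groups are my illustration; the paper's exact sequences may differ):

```python
# The crossover design described above: each group takes all 3 treatments
# for 6 weeks each, separated by 2-week washouts; only the order differs.
# (These specific orders are illustrative.)
orders = {
    "Group 1": ["high-dose GCA", "low-dose GCA", "placebo"],
    "Group 2": ["low-dose GCA", "placebo", "high-dose GCA"],
    "Group 3": ["placebo", "high-dose GCA", "low-dose GCA"],
}
for group, schedule in orders.items():
    print(f"{group}: " + " -> [2-week washout] -> ".join(schedule))
```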

One critique of the study is that there was no proper placebo control.  Looking at the paper, it is impossible to tell how much of the weight loss should be attributed to the green coffee beans rather than to the daily monitoring of diet, which is known by itself to help with weight loss.  And it's a really, really small study.

Perhaps a larger problem is that the trial was carried out in India and then written up by a U.S. researcher, Joe Vinson from the University of Scranton, as revealed by a story in The Globe and Mail (Canada) last December.  That's right: the subjects were recruited in India, all the data were collected there, and the data were simply emailed to Vinson so he could write it up.

Even more troubling was that Vinson was paid by the makers of GCA to write the study.  Worse yet, the paper states that "The authors report no conflicts of interest in this work." When asked about this by The Globe and Mail:
"Vinson said that he doesn’t gain financially if the company sells a lot of product and that the journal didn’t require him to disclose the relationship."
This small, badly run study was anything but "staggering", as Dr. Oz called it.  I have little confidence that the data sent to Vinson from India was even correct.

Maybe Dr. Oz was worried too, because a few months after his original show, he ran another show in which he looked at green coffee bean extract again.  He said he was responding to criticism of his earlier show, and he wanted to set the record straight.  For his second show, "Green Coffee Bean Extract: The Answer to Weight Loss?" he ran his own experiment:
"For the first time, we are doing an unprecedented experiment," he said. "We're doing our own study, right here on this show.... the first of its kind EVER on television!"  
Oz's experiment involved 100 women - all of them in the studio audience for his show - who took either green coffee bean extract or a placebo pill for two weeks.  And the result?  I won't make you watch the video; here is the entire statement of results, from Oz's website:
"In two weeks, the group of women who took the green coffee bean extract lost, on average, two pounds. However, the group of women who took the placebo lost an average of one pound – possibly because they were more aware of their diet for that two weeks because of the required food journal."
On the show, Oz stated proudly: "green coffee bean worked for us."

Maybe Dr. Oz's science experiment was better than the Vinson study.  But that doesn't mean it was any good.  First off, Oz seems to have ignored some critical rules on how to run an experiment involving humans.  As Scott Gavura pointed out at the Science-Based Medicine blog, Oz's study "makes a mockery of good research methodology."  Oz failed to explain how the women were recruited for the experiment, and Gavura points out that Oz apparently did not obtain the ethics board approval that all experiments on human subjects require.

Oz also seems willfully ignorant of the notion that 2 weeks is far too short a time to assess the value of a weight-loss treatment.  Will he go back to those same women a few months later to see if the effect lasted? Somehow I doubt it.

But what about that result?  The women who took the coffee bean extract lost 2 pounds, versus just 1 pound for the other group.  (Actually, thanks to Scott Gavura, we know that the difference was even smaller, just 0.76 pounds.) Oz provides no statistical analysis to demonstrate that this difference is even marginally significant. Nor does he provide the raw data that would allow others to replicate his analysis, as he would have to do if he actually tried to publish his study. But for Oz, what he described on his show seems to be proof enough.  That's a poor excuse for science.
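
For what it's worth, the analysis Oz skipped takes about five lines. Here is a sketch of a two-sample t-test on his reported result; the group sizes (roughly 50 each) and the 3-pound standard deviation are my assumptions, since Oz released neither:

```python
# A two-sample t-test on Oz's reported means (difference of 0.76 lbs,
# per Gavura).  Group sizes and the standard deviation are assumptions.
from scipy.stats import ttest_ind_from_stats

result = ttest_ind_from_stats(
    mean1=1.76, std1=3.0, nobs1=50,  # green coffee bean group
    mean2=1.00, std2=3.0, nobs2=50,  # placebo group
)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.2f}")
```

With assumptions in that neighborhood, p comes out around 0.2: nowhere near the conventional 0.05 threshold, which is to say that a difference this small, in a group this size, is entirely consistent with chance.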

Meanwhile, sales of green coffee bean extract continue to climb.  My advice: save your money.  And the next time Dr. Oz runs a science experiment, be skeptical.