
Fukushima…One More Time

Back in December, a press release1 was broadcast to the media about a peer reviewed study by Joseph J. Mangano and Janette D. Sherman (M&S) claiming that at least 14,000 people (with a special focus on children under the age of one) had died in the U.S. during the 14 weeks following the Fukushima disaster.2 At the time, I ignored the study for several reasons.

  • The peer reviewed journal in question is not that notable: it ranks 2,883rd in impact among all scientific journals, and 29th within its own category of Health Care Sciences and Services.3 (That doesn't mean it's a bad journal, just that other researchers don't cite it much.)
  • The fact that the study results were broadcast through a press release is “odd” to say the least.
  • Few media outlets carried the story. The Sacramento Bee did, but not many others. (The story is now curiously missing from the Bee's website.)
  • No one wrote in to the Foundation asking about it.
  • And radioactive iodine and cesium just don’t kill that quickly unless the doses are massive. It made no sense.

But recently, things have changed a bit. Several significant alternative health websites have picked up on the study and are trumpeting its “peer reviewed” credentials. What this means, of course, is that questions are starting to stream into the Foundation — questions that now must be answered.

However, before I address the study itself, I need to mention something. I have spent six of the last seven newsletters trashing peer reviewed studies from the medical community that disparaged nutrients and concepts strongly supported in the alternative health community: vitamins E and D, niacin, and the idea that weight gain really comes down to what you eat, for example. It would be, at the very least, hypocritical to suddenly extol the virtues of a peer reviewed study simply because its conclusions are in line with beliefs held by many of my associates, without at least putting the study under the same scrutiny that I have applied to mainstream studies. And as it turns out, when you actually do that and put the study in question under a magnifying glass, it doesn't stand up.

That said, let’s look at the study and explore some of the problem areas in it.

Examining the Fukushima “death” study

As Michael Moyer points out in his Scientific American blog,4 the study provides no evidence, and cites no source, for one of its fundamental assertions: that significant levels of Fukushima fallout arrived in the United States just six days after the reactor meltdown. Moyer also points out that even the study's authors acknowledge that the U.S. Environmental Protection Agency's monitoring of radioactivity in milk, water, and air in the weeks and months following the disaster found relatively few samples with any measurable concentrations of radioactivity. But undeterred by this lack of supporting evidence, the authors pull an unsupported conclusion out of their hats and state that "clearly, the 2011 EPA reports cannot be used with confidence for any comprehensive assessment of temporal trends and spatial patterns of U.S. environmental radiation levels originating in Japan." Or as Moyer sums up their reasoning: "In other words, the EPA didn't find evidence for the plume that our entire argument depends on, so 'clearly' we can't trust the agency's data."

Moyer goes on to argue that the study relies on "sloppy statistics," using data from 122 cities, only 25 to 35 percent of the national total, to project figures for the entire country. And he further notes that others have accused the authors of cherry picking the data. But all this misses the key point: the data itself is bogus, cherry picked or not. And here we find that SpunkyMonkey in the Physics Forums (a favorite haunt of nuclear engineers) did a great job of exposing this major flaw in the study.5 And I quote:

“Immediately seeing major problems with that study by Mangano & Sherman (M&S), I asked a statistician what he thought of it. He crunched the data and while he found several devastating statistical problems, his most remarkable finding was that the U.S. infant-death data M&S report as being from the CDC does not jibe with the actual CDC infant-death data for the same weeks.

“The M&S infant-death data allegedly from the CDC can be seen here (go to Table 3, page 55). And the actual CDC infant-death data can be seen here (go to Locations, scroll down and select Total and press Submit for the data; the data for infants is in the Age column entitled “Less than 1”). The mismatching data sets are included at the end of this post, and with the links I’ve provided here, everything I’m saying can be independently confirmed by the reader.

“Here are the mismatching data sets, note that post-Fuku weeks 15 through 24 do match:

(Note: this is not the actual chart from SpunkyMonkey's post. His chart didn't adjust for the exclusion of Ft. Worth, New Orleans, and Phoenix; M&S state in their report that they pulled those cities from the data because they did not consistently report data for the required timeframes. In truth, the adjustment didn't change things very much, but I made it so we'd have an apples-to-apples comparison. I also included a comparison of the data for the previous 12 months, as that also plays a key role in "proving" M&S's premise, and there the discrepancies are even more egregious.)

“The nature of the mismatch is that all the pre-Fukushima M&S data points are lower than the actual CDC data points and bias the data set to a statistically significant increase in post-Fukushima infant deaths. But in the actual CDC data, there is no statistically significant increase. The statistician also found that even M&S’s data for all-age deaths was in fact not statistically significant, contrary to the claim of M&S.”

(Note how the M&S data and the adjusted CDC data actually tend to match number-for-number post-Fukushima, which pretty much confirms that we're all working from the same CDC database. You will also note that the identical sort of discrepancy holds for the data presented for the previous 12 months, again biasing the data toward a statistically significant, but unsupported, increase in deaths from one year to the next.)

“Why the infant data are mismatched is not understood at this time. However, a review of the archived copies of the Morbidity and Mortality Weekly Report archive finds that the historically released data points for the weeks in question jibe with the CDC’s MMWR database. So I see no reason to believe the CDC’s online data are not the true data.”

. . .
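Before moving on, it's worth seeing just how easily that kind of baseline bias manufactures "significance." Here is a minimal sketch in Python. The weekly counts below are invented for illustration; they are not the actual CDC or M&S numbers, and the simple two-sample t-test merely stands in for whatever test the statistician actually ran. Shave a few deaths off every pre-Fukushima week, and an otherwise flat series suddenly tests as a significant post-Fukushima increase:

    # Illustrative only: invented weekly infant-death counts, not CDC data.
    from scipy import stats

    # Ten hypothetical weeks before and after the event; both series flat.
    pre_actual = [195, 201, 198, 203, 197, 200, 199, 202, 196, 204]
    post       = [198, 202, 200, 197, 203, 199, 201, 198, 204, 200]

    # The same pre-event weeks with every count shaved down slightly,
    # mimicking the downward bias found in the M&S table.
    pre_biased = [c - 8 for c in pre_actual]

    for label, pre in [("actual baseline", pre_actual),
                       ("biased baseline", pre_biased)]:
        t, p = stats.ttest_ind(pre, post)
        print(f"{label}: pre mean = {sum(pre)/len(pre):.1f}, "
              f"post mean = {sum(post)/len(post):.1f}, p = {p:.4g}")

    # actual baseline: p is large, no significant change
    # biased baseline: p is tiny, an apparently "significant" increase

The post-Fukushima numbers never change; only the baseline does. And a biased-low baseline is exactly the pattern SpunkyMonkey's statistician found in the M&S table.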

The bottom line is that it looks as though the alternative health writers who jumped onboard the Mangano & Sherman report and pitched its shocking conclusions to their followers were tricked into buying the proverbial pig in a poke. That's not to say the study's conclusions are necessarily false, only that this study doesn't prove them.

On the other hand, my gut feeling is that the study's conclusions are not true. There is no compelling evidence that any significant radioactivity made it to the United States, and no evidence that any noticeable number of adults or children in the United States died from anything that could conceivably be connected to exposure to radioactive iodine and cesium. We're talking about a sudden surge in thyroid cancer (radioactive iodine) and radiation poisoning (cesium), and there's just no evidence that any such thing happened. And if there were exposure, it is far more likely that any significant increase in deaths would not be seen until several years down the road. Even people who lived near the reactors, and who received much higher doses of radiation than could possibly have occurred in the U.S., have not started dying yet, certainly not in any noticeable numbers, let alone by the thousands.

Also, even if the numbers used in the study hadn't been so squirrelly, basing conclusions on non-specific increases in deaths is a highly questionable methodology. You're right back to a variation of the problem encountered in the infamous flu vaccine cohort studies. Or to put it another way, if you accept the study's premise, you also have to buy into the idea that radiation exposure from the Fukushima plant caused an increase in deaths in the United States from things such as (see the back-of-the-envelope arithmetic after this list):

  • Heart disease.
  • Lightning strikes.
  • Food poisoning.
  • Flu.
  • Complications from taking prescription drugs.
  • Etc.

It boggles the mind.
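To put some rough numbers on that, here's a back-of-the-envelope sketch. The weekly death counts by cause below are illustrative approximations, not exact CDC figures, though the proportions are in the right neighborhood. If M&S's claimed excess of 14,000 deaths over 14 weeks really were non-specific, spreading it across causes in their usual shares shows what you'd have to believe:

    # Illustrative approximations of U.S. weekly deaths by cause, not exact figures.
    weekly_deaths_by_cause = {
        "heart disease":     11500,
        "cancer":            11000,
        "stroke":             2500,
        "accidents":          2300,
        "flu and pneumonia":  1000,
        "all other causes":  18000,
    }
    total = sum(weekly_deaths_by_cause.values())

    claimed_excess_per_week = 14000 / 14   # M&S: 14,000 deaths over 14 weeks

    for cause, n in weekly_deaths_by_cause.items():
        share = n / total
        print(f"{cause:>18}: {share:5.1%} of deaths, "
              f"so {share * claimed_excess_per_week:4.0f} of the weekly excess")

    # Roughly a quarter of the claimed excess would have to be "extra"
    # heart-disease deaths and another quarter "extra" cancer deaths,
    # none of which anyone observed, and none of which acute low-dose
    # exposure to iodine-131 or cesium-137 could produce in a few weeks.

Nobody reported any such surges, and there is no plausible mechanism tying them to trace amounts of fallout on that timescale.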

Conclusion

I know many people live and die by the contents of peer-reviewed studies, but I'm not so sure that's such a good idea. Until a study has actually been replicated several times, all "peer reviewed" means is that contemporaries have read the study and concluded that it "looks" good and "seems" to follow proper scientific procedures. The reviewers don't verify the accuracy of the data, let alone replicate the experiments; they evaluate methodology and presentation, and simply assume the data behind the study is accurately presented. Again, if there is a problem with the data itself, that only comes to light if someone tries to replicate the study, or if someone (in this case, SpunkyMonkey and his statistician friend) decides to spend the time and peek behind the curtain. In addition, as I've pointed out many times before, there are numerous places for bias to slip into a study and totally distort its conclusions.

Does that mean that peer-reviewed studies are useless? Not at all! But they (both those that are pro medicine and those that are pro alternative health) need to be taken for what they are:

  • Conducted by human beings with all of their faults.
  • Subject to bias.
  • Sometimes subject to cheating.
  • Based on the best information available at the time.
  • Often politically driven.
  • Often agenda driven.
  • Often dollar driven — as when they are sponsored by pharmaceutical companies.
  • And often based on previous studies that were flawed. In legal terms we would call this new study “the fruit of the poisonous tree.” That is to say: any conclusions in the new study drawn from a previous flawed study are, by definition, flawed themselves.

References
1 Press Release “Medical Journal Article: 14,000 U.S. Deaths Tied to Fukushima Reactor Disaster Fallout.” PR Newswire. 19 Dec 2011. <http://www.prnewswire.com/news-releases/medical-journal-article–14000-us-deaths-tied-to-fukushima-reactor-disaster-fallout-135859288.html>
2 Joseph J. Mangano and Janette D. Sherman. “An Unexpected Mortality Increase in the United States Follows Arrival of the Radioactive Plume from Fukushima: Is There a Correlation?” December 2011. International Journal of Health Services. (Accessed 8 Jan 2012.) <http://www.radiation.org/reading/pubs/HS42_1F.pdf>
3 Journal-Ranking.com. (Accessed 8 Jan 2012.) <http://www.journal-ranking.com/ranking/listCommonRanking.html?selfCitationWeight=1&externalCitationWeight=1&citingStartYear=1901&journalListId=336>
4 Michael Moyer. “Researchers Trumpet Another Flawed Fukushima Death Study.” 20 Dec 2011. Scientific American. (Accessed 8 Jan 2012.)
5 SpunkyMonkey. 24 Dec 2011. Physics Forums. (Accessed 8 Jan 2012.) <http://www.physicsforums.com/showthread.php?t=562587>