How mind-numbing numbers whip up fear
- Ivo Vegter
- 20 Aug 2013 01:12 (South Africa)
We’ve all seen the headlines. A fraud or accident costs millions, or billions, or even trillions, and it doesn’t really matter which because they’re all scary. Or exposure to some contaminant or another causes a certain scary percentage increase in disease. The simplest cases involve headlines with large numbers. It is hard to visualise something larger than one’s immediate surroundings. When we say the world is overpopulated, for example, what do we mean?
Sure, there are seven billion people on the planet, but how many people is that, really? Is it too many? If so, why? What percentage of the Earth's surface would they need to occupy to be a problem, in your estimation?
If you put the entire world’s people into Gauteng, each person would have a space of 2.6 square metres – elbowroom to spare. If you used the entire country, every single person could have 174 square metres to themselves. And the rest of the world would be empty, of people at least. As for the amount of land needed to grow food for all these people, this appears to have peaked on average.
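The arithmetic behind these figures is easy to check. A minimal sketch, assuming a population of 7-billion and approximate land areas of 18,200 km² for Gauteng and 1,221,000 km² for South Africa:

```python
# Back-of-the-envelope density check; the land areas are approximate assumptions.
population = 7_000_000_000

gauteng_km2 = 18_200          # approximate area of Gauteng
south_africa_km2 = 1_221_000  # approximate area of South Africa

m2_per_km2 = 1_000_000        # square metres in a square kilometre

print(gauteng_km2 * m2_per_km2 / population)        # roughly 2.6 m2 each
print(south_africa_km2 * m2_per_km2 / population)   # roughly 174 m2 each
```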
The Population Research Institute has created a great video series that puts the scary big numbers of overpopulation claims in context. It is instructive not only in its political argument, but in the light it shines on statistics.
Here’s another example of shoddy statistics. During the coverage of the Deepwater Horizon accident, reports circulated all over the media that within six months, scientists had collected 8,000 birds “because of the BP oil spill”. The number was described as “staggering”.

Now it is true that they found 8,000 more animals than they usually find in the region, but then again, beachcombing scientists don’t usually descend on the Gulf Coast looking for dead animals to blame on oil companies. The example linked above notes double counting of some 10% of the sample. It might also have noted that according to the National Wildlife Federation, which uses a total of 7,000, fewer than half of the animals collected were visibly oily. And not one media story I have seen since the accident more than three years ago ever bothered to mention the total bird population of the Gulf region. Sure, 8,000 dead birds is more than anyone wants to see, but one can only understand its impact in the context of the total population. If there are only 10,000 birds, it’s a catastrophe. If there are 10-million, not so much.
The shrewd reader will note that there is no reason to believe the 8,000 count captures all the deaths, either. But estimates of the true toll can easily be inflated by guessing what share of affected animals is ever found, relative to the total population. This is why the figure for the Exxon Valdez spill, for example, stands at a surprisingly high 225,000 affected birds. They never found that many, of course.
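The point about context takes two lines of arithmetic. A sketch, with hypothetical population totals, since no story reported the real one:

```python
# The same death count reads very differently against different population sizes.
dead_birds = 8_000
for total_birds in (10_000, 10_000_000):   # hypothetical totals
    print(f"{dead_birds / total_birds:.2%} of {total_birds:,} birds")
```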
The irony of such superficial “facts” is that they can make problems look both more trivial and more severe than they really are. It can be difficult to conclude anything sensible, either way, from big numbers completely devoid of context.
Of course, scary context-free numbers aren’t limited to advocacy issues. For example, Microsoft recently lost $900-million on its Surface tablet. This sounds horrible, but it’s hard to evaluate without also knowing that revenue for the quarter rose 10% year-on-year to almost $20-billion. Net income was $5-billion, which is rather better than the $500-million loss it had in the same quarter last year. Then it took a $6.2-billion hit on goodwill impairment on an acquisition it has come to regret. And if $900-million lost in a quarter still sounds like a lot, consider that Knight Capital Group managed to lose half as much in 30 minutes last year.
A similar trick applies to arguments about executive pay. The millions paid to the rare sort of manager capable of running a large company sound indecent, but I once worked out that if everyone got paid equally, we’d all get R4,000 a month, and at that wage the highly paid would probably stop bothering to employ anyone, so we shouldn’t expect long-term employment.
There is one final resort, which is useful if you can’t get a scary-big number even when using finely honed skills in stats abuse. Just say it’s not zero, and people will still be terrified. The crude version of this is simply anecdotal evidence. “I knew a fellow who smoked 30 a day and lived to 102, when he got run over by a bus.” It may be true, but I know someone who survived being run over by a bus and died three weeks later of lung cancer caused by smoking, so the first guy was clearly smart not to survive the bus accident.
A slightly more sophisticated approach is to publish a pamphlet that says X can cause Y, and if that has ever happened, or if it is even just theoretically possible, you’re telling the truth. That sort of statement is true for an extraordinarily large set of XY pairs. Take this press release from the World Health Organization: “IARC classifies radiofrequency electromagnetic fields as possibly carcinogenic to humans.” It reports that the risk of a particular type of brain tumour was 40% higher in heavy cell phone users. The Guardian ran the story under this headline: “Mobile phone radiation is a possible cancer risk, warns WHO”. The LA Times chimed in thusly: “Study links cell phones to possible cancer risk.” The story was duly posted on a website called CanCauseCancer, which lists a few of the things you should avoid doing because they can cause cancer. Its main defect is that it doesn’t list half of all things on the planet, though in its defence it does recommend you avoid aging.
But if you read the entire WHO report, you’ll find it employed some statistical tricks. For a start, the 40% number is based on only a single unnamed study, making it highly dubious. More importantly, if true, it would raise the cumulative lifetime risk from 0.35% to 0.48%. Its classification of “possible cancer risk” essentially means that it could not positively rule it out.
A similar case of citing a percentage-change in a very small percentage happened with this Guardian headline: “Cancer risk 70% higher for females in Fukushima area, says WHO”. If you go to the original Reuters story, you’ll see the next paragraph says this raises the lifetime risk from 0.75% to 1.25%. If you also read the WHO report itself, you’ll find it is further limited to only the two most badly affected areas, so it does not affect many people. Indeed, as Bloomberg later wrote: “Fukushima Radiation Proves Less Deadly Than Feared”.
But that’s not as exciting as a 70% higher risk of cancer, now is it?
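Both headlines rest on the same arithmetic: a large relative increase applied to a small baseline is still a small absolute risk. A minimal sketch:

```python
def lifetime_risk(baseline_pct, relative_increase_pct):
    """Absolute lifetime risk (in percent) after a relative increase."""
    return baseline_pct * (1 + relative_increase_pct / 100)

# WHO cell-phone figure: 40% higher than a 0.35% baseline
print(lifetime_risk(0.35, 40))   # about 0.49%, reported as 0.48%
# Fukushima figure: 70% higher than a 0.75% baseline
print(lifetime_risk(0.75, 70))   # about 1.28%, reported as 1.25%
```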
Even the term “carcinogenic” itself is suspect. Bruce Ames, a venerable biochemistry professor at Berkeley who developed one of the leading tests for potential carcinogens, observed that natural chemicals were no more benign than synthetic chemicals. Of those tested, in each case about half turned out to be carcinogenic in laboratory animals when given in sufficiently large doses. This discovery, and the observation that the chance of high-dose exposure in ordinary circumstances was very low, led Ames to strongly criticise the tendency to whip up fear about every substance that, in theory, can cause cancer.
His interest isn’t just a scientific formality. It has a real impact on how people assess risk, what they fear, and what policy-makers do about it. “If you have thousands of hypothetical risks that you are supposed to pay attention to, that completely drives out the major risks you should be aware of,” he told the Journal of the National Cancer Institute.
Among the many nanny-state governments that beg to differ, the State of California is perhaps the most notorious. It passed a law that requires manufacturers to apply a warning label when any constituents of their products might, according to the medical definition, cause cancer. The American Cancer Society points out that “not every compound labelled as a possible cancer-causing substance has been proven to the worldwide scientific community to actually cause cancer.” It further explains that the threshold for being considered carcinogenic (or a cause of birth defects) is just a single additional case in 100,000 people over 70 years. I’d wager that half of all fruit and vegetables, no matter how organic, would fall foul of such a law.
The Cancer Society adds that California’s law “cannot offer information to help the consumer figure out what the potential risk is and how to avoid it. … The Prop 65 labels only tell you that a product has something in it that might cause cancer or affect reproduction. They don’t say what the substance is, where it is in the product, how you might be exposed to it, what the level of risk is, or how to reduce your exposure.”
So we have an expensive law that requires a massive enforcement bureaucracy, tells you exactly nothing useful, and achieves nothing other than wasting money on scaring the public witless.
The role played by the regulatory state raises a final example of how incomprehensibly big numbers can be abused to scare people. It’s easy. Simply place them in a context with extraordinarily small numbers. Regulatory limits often reflect an extremely high level of risk-aversion, so they are set very low.
Returning to Fukushima, the headlines were full of scary comparisons: “Radiation 1000 times above normal at nuclear plant”, yelled a story in the Australian, which cited the Japanese government as saying it was only eight times higher. The UK Daily Mail can always be relied upon to trump a mere rag from the dominions when it comes to sensationalism, and it exploited an erroneous reading to run the headline: “Workers flee Japan nuclear plant as false readings say radiation levels are 10 MILLION times higher than normal”. Amid all the similar alarmism – this piece had water leaking from the plant at 100,000 times “normal” levels – it was hard to keep track.
But here’s the thing: “normal” levels are, well, normal. That is, they are low, and perhaps even near zero under ordinary circumstances. And even allowable limits are usually very conservative. An article on seawater radiation levels in the Fish Information & Services trade publication noted that 1,000 times normal levels was still only one-tenth of levels considered to be harmful.
It took XKCD, a webcomic famed among physics and mathematics geeks that routinely pokes fun at bad science, to give the world a useful comparative chart. From it you’ll learn the radiation dose a single member of the public may be exposed to by the nuclear industry in a year. You’ll learn that natural background exposure and medical scans expose us to four times more, and that a single chest CT scan will give you seven times more radiation. If you work at a nuclear power plant, you’re permitted 50 times the exposure of an ordinary mortal, and that is still half of the lowest dose that can be statistically linked to increased cancer risk. You’ll also learn that the yearly radiation exposure due to natural potassium in the human body is 13 times higher than the regulatory target for a US nuclear power plant, and 50% higher than the regulatory limit.
A headline about radioactivity that says “exceeds allowable limit”, or “1,000 times higher than normal” does not intend to give you useful information. It intends to scare you.
So, the next time you buy a pesticide that claims to be “natural”, or a bed with a cancer warning, just laugh it off. And on the very slim chance that this article makes you ignore some real danger, blame the sensationalism of even serious science scribes, the emotional exaggeration of environmentalists and the precautionary principle politicians peddle as prudent policy. You’d be unlucky, but at least you didn’t live your life as a gullible, neurotic wreck. DM