
How to Make Sense of Medical Warnings

Scary headlines can cause the public to be needlessly fearful. Understanding risk and statistics can help.

A recent study of nearly 150,000 Canadian children examined a possible link between mothers' use of a popular class of antidepressants during pregnancy and an increased risk of autism in their children. The study, recently published in JAMA Pediatrics, found that children of mothers who used SSRI antidepressants during the second and/or third trimester were at an 87 percent higher risk of developing autism than those whose mothers had not.

This study made national headlines and almost certainly frightened many pregnant women. It's only the latest in a series of such findings and reports. A 2014 report in the Archives of General Psychiatry, for example, found that fathering a child late in life increases the chances of the offspring developing bipolar disorder.

Antidepressants in Pregnancy Linked to Autism Risk

The study generated headlines like "Children of Older Fathers may be at Higher Risk for Mental Illness" and "Older Fathers Linked with Bipolar Disorder." Indeed, the study concluded that "the offspring of men 55 years and older were 1.37 times more likely to be diagnosed as having bipolar disorder than the offspring of men aged 20 to 24 years."

Understanding Risk

Medical researchers distinguish between findings that are statistically significant and those that are practically significant - in other words, those that have a clear real-world effect. The 37 percent increase in bipolar disorder found among the children of older fathers seems very dramatic until you realize that the incidence of the disorder in the general population is very low to begin with.

According to the National Institute of Mental Health, about 2.6 percent of adults in any given year can be diagnosed with bipolar disorder. A 37 percent increase would translate into about a 3.6 percent chance of a father over age 55 having a child with bipolar disorder. Put another way: in the general population about 97.4 percent of children will not develop the disorder, and among the children of fathers 55 or older that number drops only to about 96.4 percent - an increase of roughly 1 percentage point in absolute risk.
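That relative-versus-absolute distinction is easy to verify with a little arithmetic. Below is a minimal Python sketch using the figures cited above; the variable names are illustrative, and the 2.6 percent baseline is the NIMH figure from the text.

```python
# Relative vs. absolute risk, using the figures cited in the text.
baseline = 0.026          # ~2.6% of adults diagnosed with bipolar disorder (NIMH)
relative_increase = 0.37  # the reported 37 percent higher risk for older fathers

elevated = baseline * (1 + relative_increase)

print(f"Elevated risk:   {elevated:.1%}")             # ~3.6%
print(f"Absolute change: {elevated - baseline:.1%}")  # ~1 percentage point
```

The headline-friendly "37 percent" is the relative figure; the absolute change, about one extra case per hundred, is what actually matters to an individual family.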

7 Surprising Facts About Pregnancy

Many journalists put the risk into perspective. An ABC News article noted that despite the alarming statistics linking autism and antidepressant use, medical experts advised that pregnant women currently on the medications should absolutely not stop taking them without consulting their doctors.

"The absolute risk is still low and the odds are overwhelmingly in their favor that it's low," Dr. Max Wiznitzer, a pediatric neurologist at University Hospitals Rainbow Babies and Children's Hospital, told ABC News. As a piece on Discovery News correctly noted, autism remains a relatively rare condition and the vast majority of women who took the drugs did not have children with autism.

Unfortunately, many people don't read past the scary headlines to the more sober and measured scientific caveats.

Autism 'Patchwork' Begins During Pregnancy

In his book "More Damned Lies and Statistics: How Numbers Confuse Public Issues," Joel Best, professor and chair of sociology and criminal justice at the University of Delaware, explains how a news story can mislead, despite the best of journalistic intentions:

"Suppose that you pick up tomorrow's newspaper and read that a medical journal has published a study indicating that diet cola drinkers are 20% more likely to have a specific medical condition. Such a sentence will confuse some people, who, for example, may now believe that 20% of diet cola drinkers will get this disease. Actually, this statistic means nothing of the sort. Let's assume that, in the general population, 5 people in 10,000 have the disease. If diet cola drinkers have a 20% increased risk, there would be 6 cases of the disease among every 10,000 diet cola drinkers (20% of 5 is 1, so a 20% increased risk would equal 5 + 1, or 6). In other words, what might seem to be an impressive statistic-'20% greater risk!'-actually refers to a very small difference in the real world: 1 additional case per 10,000 people."

One way to look at the situation is that the most significant threats to human health have long been identified. It's unlikely, for example, that a valid new study will find that eating three ounces of pork every day for 20 years doubles your risk of death; if that were the case, researchers would have noticed decades ago that regular pork eaters were dropping dead at rates far higher than average.

Oxytocin May Help Autistic Children

Consumers should be skeptical of claims about anti-cancer "miracle foods" for the same reason: If a regular diet of garlic, spinach, kale or anything else dramatically cut cancer rates, that would have been medically established long ago.

Aside from needlessly alarming the public over a practically insignificant risk, the latest news may lead some mothers of autistic children to blame themselves for their child's condition, when in fact it's more likely than not that the autism would have emerged with or without the antidepressant usage.

Often the real take-home message of medical stories for the savvy news consumer is that while a risk was found in the latest study, it's not really anything most people should worry about - or are likely to do anything about anyway.

People make choices every day that put them and their families at far greater risk of disease, injury and death - such as smoking, not wearing a seatbelt or texting while driving - than any minuscule (but statistically significant) risk like eating a steak, fathering a child later in life or taking antidepressants during pregnancy.

"A characteristic of the American dietary that has persisted throughout years has been its abundance." This sentence is no less true today than when it was

published in the American Journal of Clinical Nutrition in 1959

. That abundance, however, comes with a cost: Americans eat too much. That excess consumption and often poor nutrition has brought with it a multitude of life-changing and often life-threatening diseases and conditions, including obesity, diabetes, heart disease, cancer and more. Although the number of Americans who are overweight or obese has been on a steady climb since the middle of the 20th century,

a recent report by the New York Times suggests

that efforts to abate this health crisis might be gaining ground thanks to a shift in public attitudes. For the first time since the federal government began tracking dietary intake over more than four decades of data collecting, the daily calorie intake of the average American showed a sustained decline. Read on to see how the American diet has changed since the middle of the 20th century.

Americans Falsely Believe Their Diet Is Healthy

While calorie intake has recently been on the downswing, it is still far above where it once stood. Americans' average daily caloric intake is over 500 calories higher than it was in 1970, when the average hovered around 2,169 calories per day. What does a more than 20 percent increase in caloric intake mean for the average American? Consider that a pound of fat contains about 3,500 calories. Assuming that even a quarter of those extra calories represent excess energy beyond daily maintenance needs, that translates into a pound of stored fat gained every month, or roughly 12 additional pounds per year. Portion sizes have seen a similar increase over time.
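As a rough check on that back-of-the-envelope estimate, here is a short sketch that reuses the article's own assumptions - 3,500 calories per pound of fat, and a quarter of the extra 500 daily calories stored as surplus energy.

```python
# Back-of-the-envelope weight-gain estimate from the figures above.
extra_daily_calories = 500     # increase over the 1970 average intake
stored_fraction = 0.25         # assume a quarter is surplus energy
calories_per_pound_fat = 3500  # common rule-of-thumb value

surplus_per_day = extra_daily_calories * stored_fraction   # 125 kcal/day
days_per_pound = calories_per_pound_fat / surplus_per_day  # 28 days
pounds_per_year = 365 / days_per_pound                     # ~13 lbs

# The article rounds this to a pound a month, or 12 pounds a year.
print(f"About one pound every {days_per_pound:.0f} days, "
      f"or roughly {pounds_per_year:.0f} pounds a year")
```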

According to the U.S. Department of Health & Human Services, in the last two decades alone food portions in American restaurants have doubled or in some cases tripled. Portion sizes began increasing in the 1970s and rose sharply in the 1980s. Many food portions greatly exceed USDA and FDA standard servings, according to a study published in the American Journal of Public Health. "The largest excess over USDA standards (700 percent) occurred in the cookie category, but cooked pasta, muffins, steaks and bagels exceeded USDA standards by 480 percent, 333 percent, 224 percent and 195 percent, respectively," the study found.

VIDEO: 4 Common Diet Myths Debunked

According to the U.S. Department of Agriculture (PDF), between the 1950s and 2000 Americans' consumption of refined sugars rose on average by 39 percent, while consumption of corn sweeteners, with high-fructose corn syrup leading the charge, increased eightfold. On average, Americans consume an estimated 156 pounds - yes, pounds - of added sugar per capita every year. Excessive sugar intake can lead to all kinds of negative health outcomes, including dental problems, obesity, diabetes, liver failure and more. These concerns recently led the Food and Drug Administration (FDA) to propose a change in food labeling that would recommend that daily intake of calories from added sugars not exceed 10 percent of total consumption; more than 70 percent of U.S. adults currently get over 10 percent of their daily calories from sugar.
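To put the proposed 10 percent cap in everyday terms, here is a quick conversion sketch. The 2,000-calorie reference diet, 4 calories per gram of sugar and roughly 4.2 grams per teaspoon are standard reference values, not figures from the article.

```python
# What a 10% added-sugar cap means on a 2,000-calorie reference diet.
daily_calories = 2000
cap_fraction = 0.10
calories_per_gram_sugar = 4  # standard value for carbohydrates
grams_per_teaspoon = 4.2     # approximate weight of a teaspoon of sugar

cap_calories = daily_calories * cap_fraction        # 200 kcal
cap_grams = cap_calories / calories_per_gram_sugar  # 50 g
cap_teaspoons = cap_grams / grams_per_teaspoon      # ~12 tsp

print(f"{cap_calories:.0f} kcal = {cap_grams:.0f} g "
      f"= about {cap_teaspoons:.0f} teaspoons of added sugar per day")
```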

7 Facts About Sugar That May Surprise You

Americans aren't just eating more sugar than they used to; we're also drinking more of it. According to the Harvard School of Public Health, before the 1950s the standard soft drink size was 6.5 ounces. That decade, manufacturers began selling larger sizes, and by 1960 the 12-ounce can was everywhere. Fast forward 30 years, and 20-ounce bottles were ubiquitous. Today, single-serving soft drinks can reach 64 ounces and contain up to 700 calories. Since the 1970s, sugary drinks have grown from 4 percent of Americans' daily calorie intake to 9 percent, and a quarter of Americans get at least 200 calories a day from soft drinks. Teens and children are particularly heavy consumers: sugary drinks are the top calorie source for teenagers and are consumed daily by an estimated 91 percent of children.

Does Soda Cause Violence?

Beginning in 1980, when the USDA issued the first federal dietary guidelines implicating fats and cholesterol as a major source of Americans' health woes, particularly heart disease, the food industry began reformulating products to move away from saturated fat and toward vegetable oils and carbohydrates. Low-fat diets became all the rage, with a bevy of product lines offering low-fat alternatives. What happened after Americans got turned on to low-fat foods? They got even fatter and less healthy. Foods may not have had as much saturated fat, but they made up for it with more sugar and refined grains, which kept calorie counts the same. The switch to hydrogenated vegetable oils and margarines only increased health risks. Heart disease is still the number one killer in the United States, and we also have rising rates of diabetes and obesity to contend with. In fact, despite numerous studies since the 1990s showing that low-fat diets are ineffective at best and harmful at worst, the government continues to recommend a low-fat diet.

FDA Bans Trans Fats - More Chemicals to Consider?

Go to the snack aisle of any supermarket or convenience store in the United States, and you'll find a wide array of options to satisfy any junk food craving. What do all of these foods have in common? They're made largely of refined grains, along with other nutritionally deficient ingredients. Refined grains do not offer the same nutritional benefits as whole grains: the refining process removes fiber, iron, vitamins and other nutrients, according to the American Heart Association. Examples of refined grains include white flour, corn flour and white rice. Not only have Americans turned to unhealthier grains for a greater share of their calories; modern wheat is also less nutritious than heritage grain varieties.

Why You Should Probably Stop Eating Wheat

As Americans eat out at restaurants more and consume more processed foods, salt intake has steadily increased across all age groups. The average American between ages 20 and 74 consumed close to 1,500 mg more sodium per day in 2006 than in 1971, according to the Institute of Medicine of the National Academies. The Centers for Disease Control and Prevention (CDC) recommends that everyone ages 2 and up consume less than 2,300 mg per day - about a teaspoon of salt. Some people with certain health risks, such as diabetes or high blood pressure, should limit that further, to 1,500 mg per day. Eating too much salt causes the body to retain water, which raises blood pressure. The kidneys work overtime to deal with the excess sodium, and the increased blood volume puts a strain on the heart and blood vessels. At worst, a lifetime of sodium surplus could lead to heart failure or stroke, according to the Harvard School of Public Health.
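The teaspoon comparison is simple to verify. Table salt is about 39 percent sodium by mass, and a level teaspoon of salt weighs roughly 5.7 grams - standard reference values, not figures from the article.

```python
# Why 2,300 mg of sodium is "about a teaspoon" of salt.
sodium_limit_mg = 2300
sodium_fraction_of_salt = 0.39  # NaCl is ~39% sodium by mass (23.0 / 58.4)
grams_salt_per_teaspoon = 5.7   # approximate weight of a level teaspoon

salt_grams = sodium_limit_mg / 1000 / sodium_fraction_of_salt  # ~5.9 g
teaspoons = salt_grams / grams_salt_per_teaspoon               # ~1.0

print(f"2,300 mg sodium = {salt_grams:.1f} g salt "
      f"= about {teaspoons:.1f} teaspoons")
```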

Fast Food Is Less Salty Abroad

In 1928, presidential candidate Herbert Hoover ran under the slogan "a chicken in every pot and a car in every garage." These items might seem common enough today, but Hoover was essentially promising what were then considered luxuries to every American household. At the time of Hoover's campaign, "the few chickens raised for meat were sold directly to high-end restaurants, first-class dining cars, and luxury caterers," according to the Smithsonian Museum of American History. Chickens were instead kept for their eggs, a valuable source of protein and other nutrients. Egg consumption peaked mid-century and has been in decline ever since. The overblown connection between cholesterol in eggs and heart disease certainly contributed to Americans turning their backs on eggs, but a decline in the price of another protein source - chicken - also contributed to eggs falling out of favor. In the last half of the 20th century, poultry consumption more than tripled, according to USDA data, while red-meat consumption saw a steady decline over the same period, with Americans eating roughly 10 percent fewer pounds per capita every year.

VIDEO: Stop Eating Food: The Soylent Experiment