Why Being Fat Is Not Always Obvious
June 28, 2012 --
Today the Supreme Court upheld the 2010 health care law in a dramatic victory for President Barack Obama. The lead-up to today's decision has prompted debate between opponents and supporters of the Patient Protection and Affordable Care Act, passed two years ago. Take a look at how we got to the health care system we have in place today.
Prior to the 20th century, nothing even close to what could be called a health care system existed in the United States. Although the Civil War had led to some medical breakthroughs in terms of surgical techniques and pain management, medical knowledge, techniques and treatment availability at the time left little hope that patients would actually recover from severe ailments. As NPR's Alex Blumberg and Adam Davidson point out, medical treatments may have been downright medieval at the time, consisting of potions. But at least it was cheap. "In 1900, the average American spent $5 a year on health care ($100 in today's money)," they note in their report.
In 1912, Theodore Roosevelt was the first presidential candidate to get behind the idea of a national health insurance plan. Roosevelt ultimately didn't win election that year. Proponents of government-provided health care tried to press the issue through state initiatives, only to see their efforts fail in 16 states. Roosevelt's plan was certainly ahead of its time, particularly since there weren't many services that doctors could actually provide patients during that era.
At the same time, however, developments within the medical community changed the face of the industry. The horrors of World War I led to advances in the areas of wound care, sanitation, pain management and more, according to an article published in the Journal of the Royal Society of Medicine. Hospitals in the United States began to widely adopt the practice of using antiseptics to sanitize their facilities, preventing the possibility of medical personnel or patients becoming exposed to infection. That decade also saw the introduction of the first employer group insurance contracts (though not specifically for health insurance) as well as the first physician service and industrial health plans.
In 1928, Alexander Fleming made one of the most important discoveries in the history of medicine: penicillin, a life-saving drug used to treat countless millions. It would be decades, however, before penicillin would be mass-produced. Fleming's discovery was the signature achievement in an era that saw medical treatment become more effective, and, as a result, expensive. The Great Depression also fueled concerns about affordability of medical treatment as millions of Americans suddenly found themselves out of work. In 1929, Baylor Hospital provided the first group health insurance plan in the United States through an agreement with Dallas-area teachers. The plan was the forerunner of Blue Cross. The effort wasn't just meant to be in the best interests of patients, but also the hospitals. Patient facilities saw more empty beds as fewer patients during the Great Depression could afford treatment without participating in these collective prepaid health insurance plans.
As part of his push to create a social safety net for Americans during the Great Depression, President Franklin D. Roosevelt advocated the passage of national health insurance. Roosevelt pushed ahead with efforts to pass Social Security first, a bill that intentionally omitted any mention of medical care to ensure its passage. Harry Truman attempted to carry on Roosevelt's legacy in 1945 by calling on Congress to create such a program. His efforts failed, partly due to criticism by the American Medical Association (AMA), which called the plan "socialized medicine." In this photo taken in 1937, First Lady Eleanor Roosevelt examines a chart of enrollment in health care insurance plans.
Like its predecessor, World War II would lead to new medical advancements, including the widespread adoption of antibiotics and the use of ultrasound. The war would also have a similar effect in terms of the spread of employer-sponsored health plans. Because the nation was in a state of emergency and had a legally mandated wage freeze as a result, employers had to attract workers to assist the war effort by providing them with benefits, including health insurance. Tax laws passed between 1943 and 1945 also gave breaks to employers who provided insurance to their employees, which gave businesses all the more incentive to offer coverage. Following the war, employer-sponsored health insurance became common. In 1951, around 77 million Americans had some kind of coverage, according to an insurance industry trade group. That era also saw one of the most celebrated medical achievements in history: Jonas Salk's polio vaccine.
Although health insurance was widely available to employed Americans in the mid-20th century, the unemployed and the elderly were often excluded from these plans. President John F. Kennedy campaigned on the issue of insuring these groups. President Lyndon B. Johnson succeeded where Kennedy left off, securing the passage of a bill through Congress creating Medicare and Medicaid. At the bill-signing ceremony, shown here, Johnson presented former president Truman with the nation's first Medicare card. Within the medical industry itself, an increasing number of doctors began specializing in certain fields of medicine rather than acting as general physicians. By 1960, more than two-thirds of doctors reported themselves as full-time specialists, rather than general practitioners.
Starting with Richard Nixon in 1970, presidents have offered successive plans for covering the nation's uninsured, but they have stalled for different reasons. In 1974, Nixon put forward a plan to cover all Americans through private insurance, only to have the Watergate scandal force him out of office. An economic crisis prevented Jimmy Carter from pushing forward with a national health plan. Congress late in Reagan's second term attempted to expand Medicare, only to have the law repealed the following year. Bill Clinton had a 1,300-page health care reform bill that was never even taken up for a vote in Congress. Since Nixon's presidency, health care costs have continued to rise, often outpacing inflation. This increase is due to a number of factors, including the increased use of new medical technologies for diagnosis and treatment. The Patient Protection and Affordable Care Act signed by President Barack Obama was intended to cover the 30 million Americans who live without health insurance, according to the bill's authors. It has been the most far-reaching piece of health care legislation since Johnson signed the legislation creating the Medicare and Medicaid health care programs.
Nearly two-thirds of parents underestimate their children's weight, and half of parents do not recognize that their children are overweight or obese, reports a new study published in the journal Pediatrics.
For the study, “Parental Underestimates of Child Weight: A Meta-analysis,” a team of researchers reviewed 121 published studies containing more than 80,000 estimates of a child’s weight by his or her parent. The researchers concluded that “half of parents underestimated their children’s overweight/obese status and a significant minority underestimated children’s normal weight.”
The results are surprising to many people, given the widespread concern over children’s health and high-profile measures — by Michelle Obama and others — to start healthy behaviors early.
The idea that many people don’t notice overweight children in our famously fat-phobic society seems absurd. After all, aren’t all Americans constantly judging themselves and others over every unwanted ounce?
Actually, no. Two-thirds of American adults are overweight, and more women than men are obese, yet fewer than one-quarter are dieting at any given time. Only a minority of Americans eat a healthy diet, and fewer than one-third get regular exercise.
The study is very much in line with previous research showing that parents often underestimate their children’s weight, and many people underestimate their own weight.
A 2010 study published in the journal “Obstetrics & Gynecology” found that nearly 40 percent of overweight women and 10.5 percent of obese women believed themselves to be underweight or of normal weight. Surprisingly, though it’s often claimed that most women think they are too fat, only 16 percent of normal-weight women in the study perceived themselves as overweight.
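The weight categories discussed here (underweight, normal weight, overweight, obese) are conventionally defined by body-mass index (BMI), weight in kilograms divided by height in meters squared. As a rough illustration, here is a minimal sketch using the standard WHO/CDC adult cutoffs (18.5, 25 and 30); this is a simplification, since clinical assessment considers more than BMI alone:

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body-mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def classify(weight_kg: float, height_m: float) -> str:
    """Map a BMI value to the standard WHO adult weight category."""
    b = bmi(weight_kg, height_m)
    if b < 18.5:
        return "underweight"
    elif b < 25:
        return "normal weight"
    elif b < 30:
        return "overweight"
    else:
        return "obese"

# Example: a 5'9" (1.75 m) adult at various weights
print(classify(70, 1.75))   # BMI ~22.9 -> "normal weight"
print(classify(85, 1.75))   # BMI ~27.8 -> "overweight"
print(classify(100, 1.75))  # BMI ~32.7 -> "obese"
```

The narrow bands between these cutoffs help explain the study's finding: the visual difference between a "normal weight" and an "overweight" BMI can be just a few pounds.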
There are several possible reasons why being overweight is not obvious to many people. One is that media depictions of overweight individuals tend to be extreme (i.e., morbidly obese) rather than merely overweight, giving viewers a skewed idea of what health-damaging obesity looks like.
For example, the contestants on the hit reality television show "The Biggest Loser" are not merely a few pounds overweight — since dropping a few pounds would not make a dramatic success story — but instead may weigh two or three times as much as they should at a healthy weight.
Viewers may compare themselves to the contestants and assume that their own weight, though admittedly a little high, is acceptable and nothing to really worry about because at least they don't tip the scale at 400 pounds.
Another factor may be a social hypersensitivity to acknowledging that a child is overweight for fear of possibly triggering eating disorders such as anorexia nervosa or bulimia.
A high-profile example of this concern arose in 2011 when a book titled "Maggie Goes On a Diet" was published. It's about an overweight teen who chooses to lose weight to feel better about herself, get healthier, and play a sport she loves. The book was widely criticized as dangerous, mostly by people who had not read it and assumed that the word "diet" in the title referred to calorie restriction. However, it did not. In the book Maggie loses weight in the way that doctors and nutritionists have recommended for decades: making healthy food choices and getting exercise.
The concern that an overweight child should not be encouraged to lose weight for fear of it being taken too far is largely unfounded. Eating disorders such as anorexia nervosa are mental illnesses, caused by several factors including genetics.
Calorie-restrictive dieting can contribute to anorexia in rare cases — about 1 percent of the population who already have a predisposition for the disorder. But the vast majority of people who lose weight as doctors recommend — healthy diet and exercise — benefit greatly and are at little or no risk for eating disorders.
Similar concerns arose in 2010 when medical research found that obese women are up to 60 percent more likely to develop cancer as compared to healthy-weight women, and that up to a third of breast cancers in Western countries could be avoided if women ate less and exercised more.
As Associated Press reporter Maria Cheng noted, there was a reluctance on the part of many doctors to make too much of the study: “Any discussion of weight and breast cancer is considered sensitive because some may misconstrue that as the medical establishment blaming women for their disease.”
Cheng is referring to an institutionalized “fat taboo,” in which some fear that if women — statistically most of whom are overweight — are (truthfully) told that they can reduce their risk of cancer by slimming down, they will blame themselves if they get cancer.
The issue of not recognizing being overweight or obese is a serious concern for doctors, especially when it relates to children. Parents are responsible for the health of their children, and if they don't recognize that a child is overweight or obese, they will not take steps to address the problem — a failure that could have dire health consequences.