Monday, May 02, 2011

Placebo: How a sugar pill became a poison pill. Part 6 of a continuing saga...

Read Part 5.

This is where you might expect journalists to “fact check” the self-serving fluff and, as necessary, set the record straight. Regrettably, the climate of arm's-length detachment that should separate reporters from their sources does not apply in medical journalism. Health reporters tend to be in thrall to celebrity doctors and research scientists to begin with, and undertake little true investigative journalism that isn't spoon-fed to them by the rare healthcare dissident or some crusading personal-injury lawyer. News reports on major healthcare scandals—drugs that kill people, doctors whose subpar skills have invited scads of malpractice lawsuits—are framed as aberrations, departures from the norm. “NEW MEDICAL BREAKTHROUGH!” constitutes one of the timeless feel-good themes that the media rely on to leaven the otherwise-relentless onslaught of bad news. And an improvement in longevity is the supreme feel-good story.

No less a media heavyweight than Time, commenting on life-expectancy figures released by the CDC in December 2009, uncritically repeated researchers' contentions that “improvements in life expectancy are largely due to improvements in reducing and treating heart disease, stroke, cancer, and chronic lower respiratory diseases.” The merest peek behind the curtains would've resulted in a very different headline.

On paper, the upswing in American longevity since 1900 is difficult to ignore: about 49 for both genders then versus about 78 for both genders now, an apparent gain of nearly three full decades. But this striking then-and-now statistical juxtaposition has been framed in the public dialogue as if Will Rogers' contemporaries keeled over en masse on their 49th birthdays. Nothing could be further from the truth. Seldom has a data set been more deceiving or a statistical “fact” more spurious. The credulous reporting of those “facts” bespeaks a woeful misunderstanding of the concept of life expectancy.

Most laypeople (and too many journalists with a rudimentary knowledge of the health beat) unthinkingly use the terms longevity and life expectancy interchangeably. They regard the entire subject as a one-dimensional computation that yields a single fixed number—that number being the age at which an adult can expect to die: “Well, I'm a 74-year-old man, and male life expectancy is 75, so if there's anybody I've always wanted to tell off, I've got one year to do it!” Not so. Life expectancy—as the term is used by scientists, demographers, actuaries and allied professionals—is a sliding scale. Somewhat like a GPS navigational system that recalculates your route if you miss a turn, life-expectancy tables recompute your odds of dying at each new age plateau you attain. That new calculation is made based on the average number of additional years of life logged by others who have reached the same plateau. In scientific and actuarial circles, this is known more specifically as “life expectancy by age.” Among other things, it's the primary basis for life-insurance underwriting.

When the media and general public make casual reference to longevity, they actually mean “life expectancy at birth”: the average final age attained by all members of a given universe born in a given year, encompassing everyone from that rare centenarian in the nursing home down the street to babies who barely managed to take their first breaths before dying. Projections of future life expectancy are based on observed experience as that entire data universe inches forward a year at a time. The current figure for life expectancy at birth is 75.3 years for men and 80.4 years for women, for a combined figure of 77.9 years. In no way, however, does this imply that a man who actually reaches age 75 should spend his birthday shopping for caskets and lining up a favored eulogist. In actuarial terms, a male who attains that milestone today has a life expectancy of an additional 10.8 years.
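The distinction between the two measures can be made concrete with a little arithmetic. Below is a minimal sketch of how “life expectancy by age” is computed from a life table. The numbers are entirely hypothetical (a toy table in 25-year steps, not real CDC data), and the midpoint-of-interval assumption is a deliberate simplification; the point is only to show why expectancy at age 75 exceeds what the at-birth figure alone would suggest.

```python
# Toy life table (hypothetical numbers, NOT real CDC data):
# survivors[i] = people, out of 100,000 born, still alive at ages[i].
ages = [0, 25, 50, 75, 100]
survivors = [100_000, 97_000, 90_000, 60_000, 0]

def expectancy_at(age_index):
    """Average additional years lived by those still alive at ages[age_index]."""
    alive_now = survivors[age_index]
    person_years = 0.0
    for i in range(age_index, len(ages) - 1):
        step = ages[i + 1] - ages[i]
        # Those who survive the whole interval contribute the full step...
        person_years += survivors[i + 1] * step
        # ...while those who die within it are assumed, as a simplification,
        # to die at the interval's midpoint.
        person_years += (survivors[i] - survivors[i + 1]) * step / 2
    return person_years / alive_now

print(f"expectancy at birth:  {expectancy_at(0):.1f} more years")
print(f"expectancy at age 75: {expectancy_at(3):.1f} more years")
```

In this toy table, expectancy at birth is about 74 years, yet a person who reaches 75 can still expect roughly a dozen more: the infant and midlife deaths that drag down the at-birth average no longer apply to those who survived them. Real actuarial tables work the same way, just with one-year intervals and observed mortality rates.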

To no small degree, as the 19th Century gave way to the 20th, life expectancy was tied to one's luck at avoiding a trio of infectious diseases that stalked and killed with impunity. In 1900, the combined U.S. death rate from tuberculosis, flu and pneumonia was 396.6 per 100,000 population. (To put that in context, the current death rate from all cancers combined is 200 per 100,000 population.) TB alone claimed 194 lives per 100,000. The disease was commonly called “consumption” for its profoundly debilitating effect on late-stage victims, who appeared to waste away as if being consumed from the inside out. So severe was the panic surrounding TB that dedicated sanatoriums sprang up on the outskirts of dozens of cities for the express purpose of “treating”—that is, quarantining and warehousing—victims of the highly contagious killer.

So final a death sentence was a diagnosis of tuberculosis thought to be that doctors at these sanatoriums felt they had little to lose by attempting surgical interventions which, for sheer barbarism, rivaled anything thought up a few decades later by Josef Mengele. In the most gruesome of these, doctors would remove a patient's rib cage and encircling musculature on the theory that excising these “obstructions” might literally give a patient more breathing room. Such measures only inflicted horrific pain and in most cases hastened death.

Life in turn-of-the-century America was also marred by “slate-wiper” pandemics such as the Great Flu of 1918-1919, and random but regular outbreaks of polio and diphtheria, the latter disease one of the most feared blights among children prior to the 1930s. Terrified mothers kept their kids indoors after school, or kept them home from school altogether. The mere rumor of a child up the street who'd fallen ill was enough to drop attendance in city schools to levels that render modern America's worst truancy problems hardly worth discussing. Birthday parties ended the moment a child coughed or complained of feeling unwell.

And yet here's the thing: The most dramatic breakthrough in the bedrock measure of U.S. mortality, deaths per 1,000 population, happened between 1900 and 1930—when nothing dramatic at all was happening in medicine to account for it. In the earlier of the two years the Grim Reaper was frightfully busy, claiming 17.2 of every 1,000 Americans; in a typical tenement of the sort that had begun to crowd lower Manhattan by 1900 (as depicted in films like Hester Street and Godfather II), residents attended five or six funerals. By 1930 the Reaper's day was far less busy, at 11.3 deaths per 1,000, largely because the cumulative toll from the aforementioned “big three” of TB, flu and pneumonia had plummeted by more than half, from 396 to 173 deaths per 100,000. That precipitous drop, it's clear, had little to do with medicine and everything to do with a massive public-awareness campaign emphasizing nutrition, sanitation and personal hygiene. After all, the gold-standard TB-zapper streptomycin was still years removed from being isolated (1943), doctors treating pneumonia would not have penicillin in their arsenals till after World War II, and human trials of Salk's polio vaccine would not commence until 1954. So it's not that the lifespan of homo sapiens Americanus was magically extended by onrushing healthcare know-how. It's more that the unfortunate background circumstances that skewed the stats, leading to preposterously conservative assumptions about the limits of longevity, began to remit on their own.
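The figures above mix two denominators—deaths per 1,000 and deaths per 100,000—which makes them hard to compare at a glance. A quick conversion (the only step added here; all the raw numbers are from the text) puts them on a common footing:

```python
# Crude U.S. death rate, per 1,000 population (figures from the text).
rate_1900 = 17.2
rate_1930 = 11.3

# Combined TB + flu + pneumonia death rate, per 100,000 population.
big_three_1900 = 396.6
big_three_1930 = 173.0

overall_drop = rate_1900 - rate_1930                      # per 1,000
big_three_drop = (big_three_1900 - big_three_1930) / 100  # converted to per 1,000

print(f"overall drop in mortality: {overall_drop:.2f} per 1,000")
print(f"big-three drop alone:      {big_three_drop:.2f} per 1,000")
```

On a per-1,000 basis, the big three's decline alone removes more than two points from the crude death rate between 1900 and 1930—a sizable chunk of the overall improvement.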

If fewer people died, it was mostly because fewer people got sick to begin with.

To be continued...
