Sunday, May 08, 2011

Placebo: How a sugar pill became a poison pill. Part 7 of a continuing saga...

Read Part 6.

Despite such subsequent advances as the first influenza vaccine (1945), the first open-heart surgery (1952), the first kidney transplant (1954), and the World Health Organization's official (if premature) declaration of the defeat of smallpox (1980), U.S. death rates remained remarkably level over the 60-year period between 1948 and 2007: 9.9 per 1000 in the earliest year, 8.0 in the final one. Once you adjust for some of the plague-like factors that wiped out vast numbers of Americans in the bad old days, the longevity gains of the current era look shockingly modest.

Put simply, not that many adults are living that much longer than in years gone by. During the Civil War era, a 70-year-old man could expect to live to 80. In 1950, that same 70-year-old man could expect to live to—80. No measurable gain in a full century of medical progress!

Little has changed since, either, in spite of an endless array of pharmaceutical therapies, an aggressive, multifront counterattack on cancer, and myriad now-entrenched insights about proper health maintenance. Further, the past half-century has seen the proliferation of health-insurance plans that put these innovations within financial reach of most Americans. Nevertheless, the life expectancy of the average 70-year-old has increased by only about 3.5 years over what it was when John F. Kennedy took office.

If this multigenerational parity still seems ludicrous on its face, consider the Founders. Washington died at 67, a bit young by present standards, but Franklin and Madison were 84 and 85 at their deaths. Jefferson died at 83, poetically on the same day, July 4, 1826,
as his dear friend, John Adams, who was 90. Adams' son, John Quincy Adams, reached 80. Samuel Adams was 81. Andrew Jackson was 78. James Monroe attained 73. John Jay, 83. Hamilton died at 49—in the infamous duel with Aaron Burr, who lived to see 80. We can go even further back. In her piece, “Dead at 40,” Carolyn Freeman Travers, research manager of the Plimoth* Plantation restoration, cites the supposition of modest life expectancy as one of “several common pieces of misinformation/mistaken beliefs about people in the past.” Of Massachusetts' Andover settlement she writes, “Circumstances evidently combined to encourage a high birth rate and an exceptionally low death rate, a combination which produced a population that grew at a rapid pace.” Citing the research of historian Philip Greven, Travers continues, “The average age of twenty-nine first-generation men at the time of their deaths was 71.8 years, and the average age at death of twenty first-generation wives was 70.8 years.”

The spectacular nature of these trends was not lost on contemporaries. In 1644, William Bradford, long-time governor of the Plymouth Colony, wrote, “I cannot but here take occasion not only to mention but greatly to admire the marvelous providence of God! That notwithstanding the many changes and hardships that these people went through, and the many enemies they had and difficulties they met withal, that so many of them should live to a very old age!”

BY FAR THE most important variable in the science of longevity is the reckoning of individuals who become a death statistic at birth or soon after. No single factor has more decisively swayed the pendulum, for better or worse, than infant mortality.

Turn-of-the-century America was an inhospitable place for newborns. In several American cities, up to 30 percent of babies died before taking their first steps, and as you might imagine, some of the individual stories from this era are grievously tragic. One such story concerns the Coswells of Illinois. Between 1894 and 1907, Mary Coswell delivered no fewer than five stillborn children. With the fifth birth, Mary herself died. Another contemporaneous account mentions (but does not name) a husband and wife who, after the woman's second stillbirth, went to a nearby overlook and, in front of horrified bystanders, joined hands and leaped to their deaths.

It's hard to overstate infant mortality's impact on longevity numbers from the early 20th century. In 1900, a male child at birth had a life expectancy of about 48 years—but if he survived to age 1, his remaining life expectancy jumped instantly to 54 years. That gain represents the “write-off” of first-year mortality: with the appalling toll of infant deaths now shunted back to an earlier data set, the rest of the cohort “gains” an instant six-year longevity benefit (in much the same way that dropping a class's lowest marks allows the arithmetic mean to rise commensurately). Over the ensuing decades the first-year gap narrowed, then disappeared. By 1980, that first completed year subtracted from remaining life expectancy, just as all subsequent years do.
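For readers who like to check such arithmetic, here is a minimal sketch in Python of how those two 1900 figures can coexist. The roughly 13 percent first-year death rate and the timing of infant deaths within year one are my assumptions, chosen only to illustrate the mechanism; they are not figures from the post or its sources.

```python
# A rough consistency check of the 1900 figures cited above. The ~13%
# first-year death rate and the guess that infant deaths fall, on average,
# a couple of months into year one are assumptions, not numbers from the post.

e1_remaining = 54.0                 # remaining life expectancy at age 1 (from the post)
expected_age_if_survive = 1 + e1_remaining   # i.e., about 55 years of total life
infant_death_rate = 0.13            # assumed: ~130 first-year deaths per 1000 births
avg_age_at_infant_death = 0.2       # assumed: deaths cluster early in the first year

# Life expectancy at birth is just the weighted average of the two groups.
e0_at_birth = ((1 - infant_death_rate) * expected_age_if_survive
               + infant_death_rate * avg_age_at_infant_death)

print(f"Implied life expectancy at birth: {e0_at_birth:.1f} years")  # ~47.9, i.e. about 48
```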

Here's another way of looking at it. In 1920, when life expectancy at birth was a shade above 56 years, the infant-mortality rate stood at 85.8 deaths per 1000 live births, or 171,000 infant deaths in total; few of those aforementioned tenements remained untouched. By 2000, life expectancy at birth had risen to an even 77, and the infant-mortality rate had dropped to 6.9 per 1000 live births, or 28,000 infant deaths nationwide. Had 1920's rate of infant mortality still applied in 2000, the total number of infant deaths that year would've skyrocketed to well over 300,000. Those additional deaths at “age zero” would've chopped seven full years off 2000's overall life expectancy of 77.
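The counterfactual can be sanity-checked the same way. The sketch below leans on the post's own numbers (the 85.8 and 6.9 per-1000 rates, the 77-year life expectancy, and the roughly 4 million births mentioned in the maternal-mortality passage below); treating every infant death as occurring at age zero is a simplification, and this crude version lands at about six years, in the same ballpark as the seven cited above.

```python
# Back-of-the-envelope version of the counterfactual: apply 1920's
# infant-mortality rate to 2000's birth cohort and see what happens to
# life expectancy at birth. Infant deaths are treated as occurring at age zero.

births_2000 = 4_000_000          # round figure from the post
imr_1920 = 85.8 / 1000           # 1920 infant-mortality rate, per live birth
imr_2000 = 6.9 / 1000            # 2000 infant-mortality rate, per live birth
e0_2000 = 77.0                   # life expectancy at birth in 2000

deaths_counterfactual = births_2000 * imr_1920   # ~343,000 ("well over 300,000")
deaths_actual = births_2000 * imr_2000           # ~28,000

# Those who survived infancy in 2000 averaged slightly more than 77 years;
# adding hundreds of thousands of deaths at age zero drags the cohort mean down.
survivors_mean = e0_2000 * births_2000 / (births_2000 - deaths_actual)
new_e0 = survivors_mean * (births_2000 - deaths_counterfactual) / births_2000

print(f"Counterfactual infant deaths: {deaths_counterfactual:,.0f}")
print(f"Life expectancy at birth falls by about {e0_2000 - new_e0:.1f} years")
```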

And it gets worse. In 1920, the maternal death rate—representing women, like poor Mary Coswell, who died of pregnancy-related complications or during childbirth—was just under eight women for every 1000 births. By 2000 that grim statistic had been sliced to near-nonexistence: roughly one woman for every 10,000 births.

But again, let's assume 1920's death rate still applied in 2000, and that each of the 4 million births that year represented one mother (i.e. leaving out the nominal statistical impact of twins and other multiples). That simple exercise in “what if?” adds some 30,000 maternal deaths to our hypothetical mix. Inasmuch as the age of the typical American woman giving birth is 25, those 30,000 premature deaths lop another full year off overall longevity figures.
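One more sketch, for this maternal-mortality what-if. The 8-per-1000 rate, the 4 million births, and the typical maternal age of 25 come from the post; the assumed female life expectancy of about 79.5 years in 2000, and the choice to spread the lost years over a female birth cohort of roughly 2 million, are mine.

```python
# Rough arithmetic for the maternal-mortality counterfactual above.

births_2000 = 4_000_000               # one mother per birth, per the post's simplification
maternal_rate_1920 = 8 / 1000         # just under 8 maternal deaths per 1000 births
extra_maternal_deaths = births_2000 * maternal_rate_1920   # ~32,000 ("some 30,000")

age_at_death = 25.0                   # typical age of an American mother giving birth
female_e0_2000 = 79.5                 # assumed female life expectancy at birth in 2000
years_lost_each = female_e0_2000 - age_at_death

female_cohort = births_2000 / 2       # assumed: lost years spread over ~2 million women
drop = extra_maternal_deaths * years_lost_each / female_cohort

print(f"Extra maternal deaths: {extra_maternal_deaths:,.0f}")
print(f"Female longevity figure falls by about {drop:.1f} years")  # ~0.9, close to a full year
```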

In 1912, President Taft signed a bill creating the Children's Bureau, which embraced as its charter mandate an all-out assault on the nation's alarming rate of infant mortality. Over the next decade the effort drew on the wisdom of the finest minds in public health, clinical medicine and social welfare. It centered at first on the sanitary processing and handling of milk, then shifted its emphasis to other areas of hygiene and education, then took up the promotion of comprehensive infant- and maternal-welfare services. These initiatives wrought a sea change in the medical and cultural view of childbirth: from a historical model of care that kicked in only after delivery, to a comprehensive program of prenatal mentoring and monitoring.

The dividends began to show themselves almost immediately. Although healthcare indisputably played a supporting role in extending the lives of countless infants from that period, the bulk of the care was preventive, not interventional. It was seldom a case of “treating” newborns who fell ill. Rather, the goal was to prevent newborns from falling ill to begin with.

In any case, by the time World War II GIs returned home and put down roots, the major victories in this omnibus war on infant mortality had been won. Thus America's all-important triumph over infant death was achieved in large part without today's costly “miracles of modern medicine.” Sonograms and fetal heart monitors—now deemed obligatory elements of a proper prenatal regimen—weren't invented till the late 1950s, and wouldn't come into general usage for two decades more. It's fair to say that their impact on infant mortality has been negligible. Indeed, if one wanted to be a curmudgeon, one might point out that in recent years, America's infant-mortality rate has crept back up slightly—this, in an era when expectant mothers can avail themselves of a panoply of health services that their forebears from Model T America could not have imagined.

* She uses the Colonial spelling.

1 comment:

our friend Ben said...

Thank you for voicing one of my greatest gripes. Like idiots who try to show their superior knowledge by proclaiming that "tomatoes are fruits, not vegetables!" when numerous vegetables, including peppers, eggplant, pumpkins, and squash are also technically fruits (as corn is technically the seedhead of a grass), the folks who proclaim that moderns live longer and previous generations were considered old at 20 (or 30 or 40) are simply ignorant. If you survived birth (or giving birth) and the occasional plague, and had the means to provide yourself with warmth, shelter, and food, you had an excellent chance of living to what we still consider a ripe old age. As you rightly point out. Cancer, diabetes and heart disease, our own killers, were vanishingly rare before the 20th century; you'd probably have had as much chance of dying of leprosy. George Washington, that hale and hearty soul, might have made it to 100 had he not caught pneumonia, and then been bled to death by the helpful doctors of his day, all from riding over his estate in a cold rain, then refusing to change into dry clothes because the delay would incommode his dinner guests. What is unquestionably true is that more people are living to extreme age (90s and 100s) than ever before in human history. But I credit that to better knowledge of nutrition, the abundance of food, and better dentistry rather than to modern medicine. People can now keep their teeth (or reasonable facsimiles) throughout their lives and use them to consume healthful fare year-round, rather than ending their days as toothless ancients gumming milk-sopped bread and gruel, when they could get them...