One Strange Health Statistic That Could Improve American Healthcare

One of the Affordable Care Act’s stated goals is to “[put] consumers back in charge of their health care” and allow them to make better-informed decisions. Given the enormous information disparity between doctors, who have gone through medical school, and patients, who usually haven’t, this goal may seem disingenuous. What health information could patients have that their doctors don’t?

A new study in PLoS ONE looks at a surprisingly simple metric, consumers’ self-rated overall health on a scale from “Excellent” to “Poor,” and finds that it has become increasingly powerful at predicting mortality over the last few decades. The results suggest that there may be opportunities to increase efficiency and decrease costs in the health sector while keeping patients empowered.

There is a well-known correlation between self-rated health and lifespan (a proxy for overall health), independent of observable health metrics. This suggests that patients often have intuition about their overall lifestyle that can complement a physician’s expert judgment. While this correlation has persisted for decades, there is concern that changes in healthcare, such as the “medicalization” of society, the spread of often inaccurate health information online, and direct-to-consumer prescription drug advertising, may cause it to deteriorate.

To address these issues, Jason Schnittker and Valerio Bacak looked at data from the General Social Survey (GSS), a decades-long study of tens of thousands of people conducted by the National Opinion Research Center. Between 1980 and 2002, the GSS asked respondents to characterize their overall health as “Excellent,” “Good,” “Fair,” or “Poor.” Because the GSS has been linked to a mortality index that records when respondents died, Schnittker and Bacak could measure the link between self-rated health and time-to-death.
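
In statistical terms, this linkage turns each GSS interview into a survival-analysis record: a rating, a follow-up time, and an indicator of whether the respondent died. A minimal sketch of that setup, using a Cox proportional hazards model on hypothetical data (the column names, the toy values, and the use of Python’s lifelines library are illustrative assumptions, not the authors’ code):

```python
# Hedged sketch: linking a self-rating to time-to-death with a Cox model.
# Hypothetical columns:
#   years_observed - years from interview until death or end of follow-up
#   died           - 1 if the respondent died during follow-up, else 0
#   srh            - self-rated health: 3=Excellent, 2=Good, 1=Fair, 0=Poor
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years_observed": [22.0, 4.5, 15.2, 9.8, 3.0, 2.1, 18.0, 22.0],
    "died":           [0,    1,   1,    1,   1,   1,   0,    0],
    "srh":            [3,    0,   2,    1,   3,   0,   2,    1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_observed", event_col="died")
cph.print_summary()  # hazard ratio per one-step improvement in self-rating
```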

The state of existing research is represented in the graph below, which averages 1980-2002 GSS data to show that the percentage of respondents remaining alive at a given age is linked to their rating:

Source: “The Increasing Predictive Validity of Self-Rated Health,” Jason Schnittker and Valerio Bacak, PLoS ONE 9, no. 1 (2014).
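
A graph of this kind can be built from the linked records as Kaplan-Meier survival curves, one per rating category. The sketch below is illustrative only (hypothetical data and column names; the paper plots survival by age, while this toy version uses years since interview):

```python
# Hedged sketch: one Kaplan-Meier survival curve per self-rating category.
import pandas as pd
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter

df = pd.DataFrame({
    "years_observed": [22.0, 4.5, 15.2, 9.8, 3.0, 2.1, 18.0, 7.3],
    "died":           [0,    1,   1,    1,   1,   1,   0,    1],
    "rating": ["Excellent", "Poor", "Good", "Fair",
               "Excellent", "Poor", "Good", "Fair"],
})

ax = plt.subplot(111)
for rating, group in df.groupby("rating"):
    kmf = KaplanMeierFitter()
    kmf.fit(group["years_observed"], event_observed=group["died"], label=rating)
    kmf.plot_survival_function(ax=ax)  # fraction still alive over time
ax.set_xlabel("Years since interview")
ax.set_ylabel("Fraction of respondents still alive")
plt.show()
```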

Schnittker and Bacak’s main contribution to this picture was to study how the accuracy of self-rated health has changed over time. They found a significant improvement in its predictive power, as shown in the following graph comparing ratings made in 1980 with ratings made in 2002:

Source: “The Increasing Predictive Validity of Self-Rated Health,” Jason Schnittker and Valerio Bacak, PLoS ONE 9, no. 1 (2014).

Those considering themselves in “Excellent” health in 1980 were only marginally more likely to survive into old age than those in “Poor” health, but by 2002 the disparity had widened enormously. The reason, the researchers hypothesize, is not that those in “Poor” health are dying faster, but that respondents have become more knowledgeable: those with high mortality risk are much more aware of that risk and more likely to place themselves in the “Poor” category, while those in better health are more likely to choose a higher rating. Schnittker and Bacak found that, on average, the accuracy of a “Poor” rating improved by 5 percent every year between 1980 and 2002. This is an astounding degree of accuracy for the average consumer’s subjective rating on a four-point scale.
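
To put that figure in perspective: if the 5 percent annual gain is read as compounding multiplicatively (an interpretive assumption for illustration; the paper derives the trend from its survival models), the cumulative improvement over the study window is nearly threefold:

```python
# Back-of-envelope: a 5% per-year gain in the predictive accuracy of a
# "Poor" rating, compounded over the 1980-2002 study window (assumption).
annual_gain = 1.05
years = 2002 - 1980           # 22 years of GSS data
print(annual_gain ** years)   # ~2.93, i.e. almost a threefold improvement
```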

The researchers repeated the study while controlling for gender, health behaviors (such as smoking), intelligence, education, and deaths from accidents or “random” diseases like cancer and cardiovascular events, and found that the improvement in prediction persisted across almost all subgroups of the population.

Furthermore, they looked at how health information may improve or degrade the accuracy of these ratings, using a survey conducted among GSS respondents in 2000 and 2002 that asked where they got their regular information and whether they sought health-specific information from those sources. In general, more information meant better predictions, counter to expectations of a “medicalized,” paranoid, or overloaded public:

Those who regularly read the newspaper, for instance, gave themselves health ratings that predicted their lifespan 49 percent better than random chance; if they also read the health section of the newspaper, their ratings became 128 percent better than random chance. One notable exception is health information from the Internet, which drastically lowered the predictive power of self-rated health, possibly confirming the researchers’ expectation that online information may be inaccurate or hyperbolic enough to distort consumers’ perceptions of their health.
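
One plausible reading of “X percent better than random chance” (an interpretive assumption; the paper’s exact metric is not reproduced here) treats chance as a baseline of 1.0, which makes the reported figures directly comparable:

```python
# Interpretive sketch: treating "better than random chance" as a multiple
# of a chance baseline of 1.0 (an assumption, not the paper's definition).
newspaper_readers = 1.0 + 0.49   # ratings 49% better than chance
health_section    = 1.0 + 1.28   # ratings 128% better than chance
print(health_section / newspaper_readers)  # ~1.53x as predictive
```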

The accuracy of consumers’ self-rated health may be driven by subjective information that their doctors cannot feasibly ascertain (e.g., how well they eat, how much they really exercise, and whether they feel tired in the mornings). While it would be interesting to see how self-rated health matches up with doctor-rated health, Schnittker and Bacak have confirmed that consumers’ subjective and presumably ill-informed ratings can function effectively as proxies for actual health, especially when consumers seek out health information, and that these ratings have improved significantly over time. The results appear to support policies that enable patients to make general health-related decisions, such as how often to seek checkups and whether to buy pricier insurance, using one of the best information sources around: themselves.

Article Source: Jason Schnittker and Valerio Bacak, “The Increasing Predictive Validity of Self-Rated Health,” PLoS ONE 9, no. 1 (January 2014).

Feature Photo: cc/(O’Connor College of Law)
