Last week, a study was published in the Journal of the American Medical Informatics Association that some media outlets interpreted to mean online reviews aren’t helpful to consumers when choosing a doctor. We beg to differ.
Yelp exists today because its founder, Jeremy Stoppelman, fell ill in 2004 and wanted recommendations for a doctor in San Francisco. He couldn’t find any useful information online, so he built a platform to make it possible for people to share and find reviews of doctors — and every other kind of local business. Ever since, health care providers have been a crucial category for Yelp, and while most people eat out and go shopping more often than they get sick, they still turn to us to read and contribute millions of reviews in the health category, covering hospitals, cannabis clinics, sports psychologists and pediatricians.
The study, “Online physician ratings fail to predict actual performance on measures of quality, value, and peer review,” compared ratings of 78 Cedars-Sinai Medical Center doctors on Yelp and other online rating sites with different measures of the same doctors’ performance, including cost of care and reviews by their peers and administrators. It found no correlation between the two types of ratings, though ratings of the same doctors on different sites were fairly consistent.
We’re glad that the authors found consistency in online ratings. Subjectivity and specific circumstances can affect any one review, but bundle a set of reviews together and they appear to measure something persistent about a doctor’s care. Every part of a visit to the doctor’s office shapes the overall patient experience and the likelihood of seeking care in the future: the office staff, the bedside manner, the follow-up communication, the wait time on the office phone line. Peer evaluations made by other doctors are unlikely to capture those elements.
While some prior studies have found little link between certain online ratings and other measures of doctor performance, Yelp ratings of hospitals have repeatedly been shown to correspond with objective quality metrics, including potentially preventable readmissions — and they often provide more information than traditional hospital evaluations such as the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS) survey. Other studies have found similar connections between non-Yelp online ratings and health outcomes for hospitals.
Beyond the study’s findings, academics have also weighed in on the design and scope of the research.
“My initial reaction is an observational study of 78 physician ratings seems like a rather small sample to make conclusions across 8 different specialty areas!” said Brad Carlin, professor of biostatistics at the University of Minnesota School of Public Health and an American Statistical Association expert, in an email.
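To make that sample-size concern concrete, here is a rough back-of-the-envelope sketch of our own (the numbers are hypothetical, not drawn from the study): splitting 78 doctors across eight specialties leaves roughly ten doctors per specialty, and a correlation estimated from ten data points comes with a very wide confidence interval.

```python
# Illustrative sketch only -- hypothetical numbers, not data from the study.
# Shows how wide the uncertainty on a correlation is when each specialty
# contributes only about 10 doctors, using the standard Fisher z-transform.
import math

def correlation_ci(r, n, z_crit=1.96):
    """Approximate 95% confidence interval for a correlation r observed on n points."""
    z = math.atanh(r)                    # Fisher z-transform of the observed correlation
    se = 1.0 / math.sqrt(n - 3)          # standard error in z-space
    lo, hi = z - z_crit * se, z + z_crit * se
    return math.tanh(lo), math.tanh(hi)  # transform back to the correlation scale

# A moderate observed correlation of 0.5 across 10 doctors in one specialty:
print(correlation_ci(0.5, 10))  # roughly (-0.19, 0.86) -- the interval includes zero
```

With samples that small, even a genuinely moderate relationship between online ratings and other performance measures could easily show up as “no correlation,” which is exactly the caution Carlin raises.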
Dr. Naomi Bardach, associate professor of pediatrics and health policy at the University of California, San Francisco, and co-author of one of the studies showing the relationship between Yelp ratings and other hospital scores, said that the newest study was comparing online ratings to facets of medical care patients might not be aware of, such as steps taken by doctors to drive down costs. “I don’t think that the study invalidates the consumer ratings, it just compares them to measures that I would not expect to be transparent to consumers,” Bardach told me in an email. (Yelp has no financial relationship with anyone quoted in this article.) “Hence, consumer observations of other aspects of care provided by these specialists might be quite valid and important to other consumers.”
We’re also glad that the authors share our goal of making more measures of health-care quality available to the public. “Consumer ratings of physicians should be paired with information on quality and value of care in order to help patients make more informed decisions when selecting providers,” they wrote. That’s exactly what we’re aiming to do with our consumer protection initiative for hospitals and other health-care services. The measures used in the latest study, as well as many other doctor scores, are not public, whereas online ratings and reviews are transparent, and doctors can choose to respond to them. Medical practices and facilities are already doing a great deal of work to evaluate doctors. We’d love to see patients get access to more of the results of that work to help inform their decisions.
We were disappointed, though, by press coverage suggesting the study shows patients shouldn’t Yelp their doctors. The headline of a Bloomberg article about the study – “Don’t Yelp Your Doctor. Study Finds Ratings Are All Wrong.” – “is too strong and not entirely reflective of our research,” study co-author Timothy Daskivich told me in an email.
To better understand the value of Yelp reviews of doctors, and to continue the project our CEO started in pursuit of better doctor recommendations, we need more people to write more of them. One possible reason online ratings didn’t match up with evaluations of doctors in the Cedars-Sinai Medical Center system is that many doctors hadn’t been rated often, Bardach said. “If they had higher numbers of ratings, that were not so close to the minimum, I would feel more confident in their findings,” she said. Headlines that distort a study’s findings can limit our ability to find great doctors and understand what makes them great.
We also need further research. We’d be glad to share data with researchers who want to study our health ratings in more depth, as we already do through our longstanding partnership with ProPublica.
We’re always looking for more third-party health data to share with users because such data can measure different aspects of health care than our reviews do. A hospital with crowded emergency rooms might score below average on that measure; if patients get outstanding care when it is finally their turn, they can reflect that in their reviews.
In 2013, Yelp began incorporating health-inspection scores on restaurant pages and created a data standard to help more jurisdictions share their hard-earned knowledge of hygiene standards with people deciding where to eat. In 2015, we started displaying key statistics and consumer health survey results on the pages of thousands of hospitals, nursing homes and dialysis centers. And two months ago, we began sharing further information about maternity wards in California.
We will continue to look for opportunities to expand the data we make available on Yelp health-care pages. “We believe there is a role for online ratings and would be happy to work with Yelp to enhance their value by coupling online ratings with additional data of relevance,” Daskivich wrote in the email. We’d be happy to, too.
Related: Naomi S. Bardach, associate professor of pediatrics and health policy at the University of California, San Francisco, responded to the Journal of the American Medical Informatics Association study in a letter to the editor published in the June 2018 issue.