
New Study From FTC Economist Compares Yelp Review Quality with Competitors

Key Findings From “Do Bad Businesses Get Good Reviews? Evidence from Online Review Platforms”

      • Yelp’s review ratings tend to be evenly distributed — similar to what the average consumer experiences with local businesses in the real world, whereas other platforms have a skewed rating distribution that favors higher star ratings.
      • Yelp’s average review contains approximately 593 characters, whereas 45% of Google reviews contain fewer than 100 characters, and nearly half of those are no-text reviews. Yelp requires review text, and about 26% of its reviews contain between 501 and 1,000 characters.
      • FTC Economist Devesh Raval found that businesses whose reviews were filtered by Yelp’s recommendation software, as well as by the BBB’s review algorithm, had noticeably higher average star ratings on Google, Facebook, and HomeAdvisor. Raval estimates “that about half of the difference between Google and Yelp ratings of low quality businesses are due to fake reviews, and at least about a quarter of Google reviews for low quality businesses are likely fake.”

Yelp exists to empower and protect consumers. It’s why millions of people come to our platform every day — the reviews and ratings on Yelp are consistently good predictors of their own offline experiences with local businesses. 

Devesh Raval, an economist at the Federal Trade Commission (FTC), released a study last month that analyzed how review ratings differed across five platforms: Yelp, Google, Facebook, HomeAdvisor and the Better Business Bureau (BBB). Through his various analyses, he found that non-Yelp platforms exhibit signs of rating inflation: businesses can carry high star ratings even if they’ve received a poor letter grade from the BBB or had numerous consumer complaints filed against them. Comparatively, his findings show that Yelp star ratings have the most uniform distribution across the five-star spectrum, and that our review standards are consistently aligned with his benchmark of BBB ratings and consumer complaints.

Yelp Star Ratings are the Most Evenly Distributed

According to Raval’s research, businesses listed on the BBB exhibit a bimodal rating distribution: they tend to have either more than four stars or fewer than two. On the Yelp platform, review star ratings follow a more normal distribution, spread across the scale rather than clustered at the extremes. However, Google, Facebook and HomeAdvisor were all found to have rating distributions that skew heavily toward higher star ratings. Raval likens this to the Lake Wobegon effect, in which nearly every business on a platform appears to be above average.

A separate 2018 study published in the Journal of Consumer Research found that four-star reviews can be more persuasive to potential customers than five-star reviews — increasing the likelihood of a purchase by 19%. In an interview with The Wall Street Journal, Boston University Associate Professor Daniella Kupor explains, “We found that when people saw the four-star review, they thought that the reviewer was more thoughtful and that the reviewer’s evaluation was more accurate… A very thoughtful, moderately positive review can more greatly persuade people to purchase a product than an extremely positive review that clearly didn’t result from a lot of thought.”

Yelp Ratings Align with the Better Business Bureau’s “Low Quality Businesses”

Raval defines the quality of a business by the number of complaints filed with consumer protection organizations (such as the BBB or the FTC) or by whether the business received an F letter grade from the BBB.

The study found that as a business’s BBB letter grade declines, its rating on Yelp falls faster than on other platforms. In fact, Raval states, “On average, an F graded business on Google, Facebook, and HomeAdvisor has a higher [difference in] rating than an A+ graded business on the BBB.” The research also concludes that “a low quality business on Google has about the same average rating as a medium quality business on Yelp or a high quality business on the BBB’s platform.”

For businesses with more than 25 consumer complaints, Yelp was the only platform whose average star rating fell slightly below the BBB’s — by about 0.1 stars. According to Raval, “in contrast to Yelp, the gap between the other platforms and the BBB rises… Google ratings are 1.25 stars higher than the BBB, Facebook ratings 1.37 stars higher, and HomeAdvisor ratings 1.16 stars higher.”

Yelp’s Recommendation Software Cleans Up Low Quality Businesses

Yelp uses automated recommendation software to highlight the most useful and reliable reviews, continuously evaluating dozens of signals to weed out reviews that may be inauthentic, solicited, biased or written by users we just don’t know enough about.

While the report found that Google, Facebook, and HomeAdvisor tend to have more reviews for the same business than the BBB or Yelp, Raval’s investigation suggests this difference boils down to potentially fake reviews going unchecked on Google.

Raval suspects that more than one-fourth of Google reviews for low quality businesses are likely fake.

Review Text Adds Value to Consumers

It can be hard for consumers to settle on the right restaurant or a trustworthy plumber if there isn’t any review text to add color to a star rating. When platforms make review text optional, it raises the question of whether those reviews are actually helpful to consumers. Raval states, “requiring a reviewer to write text imposes greater costs on reviewers, which might reduce the quantity of reviews but increase their quality.”

In his analysis, Raval found that 45% of Google reviews are 100 characters or fewer, many of which contain no text at all (22% of Google reviews analyzed), providing little insight into what a consumer’s experience might be like. On the other hand, only 4% of reviews on Yelp are shorter than 100 characters, and the most common length bracket is 501 to 1,000 characters (26% of Yelp reviews analyzed). Yelp has always required review text — in fact, the study finds Yelp reviews contain, on average, about 593 characters, which is more than double Google’s average count of 250 characters.
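The length statistics above boil down to simple per-review arithmetic. As a minimal sketch (the review strings below are invented sample data, not from the study, and `length_stats` is a hypothetical helper, not code from Raval's analysis), the measures could be computed like this:

```python
# Hypothetical sketch of the character-length measures discussed above:
# share of reviews under 100 characters, share with no text, and the
# average character count. Sample reviews are invented, not study data.

def length_stats(reviews):
    """Return (share under 100 chars, share of empty reviews, mean length)."""
    n = len(reviews)
    under_100 = sum(1 for r in reviews if len(r) < 100) / n
    empty = sum(1 for r in reviews if len(r) == 0) / n
    mean_len = sum(len(r) for r in reviews) / n
    return under_100, empty, mean_len

# Four toy reviews: one empty, one very short, two longer ones.
sample = ["", "Great!", "x" * 250, "y" * 600]
under_100, empty, mean_len = length_stats(sample)
print(under_100, empty, mean_len)  # 0.5 0.25 214.0
```

On real platform data, the same tallies would yield the percentages the study reports (e.g. 45% of Google reviews under 100 characters, a 593-character Yelp average).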

This report bolsters Yelp’s argument that Google’s self-serving bias and anticompetitive conduct could harm consumers when Google prioritizes its own lower quality content ahead of other, higher quality sources. Yelp ratings and reviews tend to mirror what people experience in the real world when they engage with local businesses — good, bad and in between. We believe this is the core strength of Yelp. Our content allows consumers to make informed spending decisions, and it levels the playing field for hardworking businesses that rightfully earn their great reputations. 

Learn more in FTC Economist Devesh Raval’s full study: Do Bad Businesses Get Good Reviews? Evidence from Online Review Platforms