FTC finalizes new rules around fake reviews
Offenders can be charged nearly $52k per violation, but it’s unclear how the rule will be enforced.
At the inaugural creator summit at the White House last week, President Biden told the assembled creators that they have more influence than traditional media. “You break through in ways that I think are going to change the entire dynamic of the way in which we communicate,” he said.
It was a high-level testament to the power of online voices. In the age of algorithms and opinions, tech platforms shape the way we feel about plenty of things — businesses like restaurants included.
The Federal Trade Commission, a government agency tasked with advocating for American consumers, knows this, too. Last week, it finalized its set of rules around false and fraudulent reviews, and they’re far-reaching. The final list prohibits fake reviews, but also review-adjacent modern behaviors, like buying social media followers or intimidating customers to prevent them from leaving, or to make them remove, a negative review. The agency can seek a maximum civil penalty of just under $52,000 per violation.
The agency started exploring rulemaking in 2022, probably beginning with an oddly secretive, closed-door meeting in San Francisco (one it declined to confirm it attended). At that meeting, reps from Yelp, Tripadvisor, Trustpilot, Google, and other tech platforms discussed how to work together to tackle bogus reviews. The meeting wasn’t reported for months after it happened, which I suppose is helpful in hindsight — you can’t exactly propose new methods of combating nefarious behavior on the internet where bad actors can see them.
The full text of the FTC rule spells out the prohibited behaviors. Among other things, it calls out fake celebrity testimonials (here’s looking at you, crypto) and addresses fake reviews generated by artificial intelligence. It prohibits businesses from buying reviews, whether from companies that sell fraudulent reviews en masse or by compensating customers for leaving a favorable one. (Think: a sign that reads “Show us your 5-star Yelp review for a discount.”) Mostly, the new rule provides insight into how the powers that be hope to tackle a pervasive problem for small business owners. It’s still unclear how the rule will actually be enforced.
Customer reviews have become big business for the companies that traffic in them.
Yelp, a company worth $2.2 billion, has been policing this behavior for a while. Last September, it added a landing page calling out violations of its own review ethics policies, including photo ‘evidence’ gathered from restaurants and other offending businesses. A peek at the page today suggests Yelp is also going after paid influencer partnerships that require a Yelp review, even if the business doesn’t explicitly ask for a favorable one.
“While Yelp’s policies have long prohibited practices outlined in the FTC’s final rule, we believe the enforcement of this new rule will improve the review landscape for consumers and help level the playing field for businesses,” Aaron Schur, Yelp’s general counsel, said in a statement.
“We’ve invested in both technology and human moderation to mitigate misinformation on Yelp, as well as built our platform around fighting fake reviews and other misleading behaviors through scalable solutions. For years, Yelp has also provided the FTC and other regulators with leads on deceptive review conduct.”
Importantly, the new rule doesn’t require platforms that publish reviews to make sure they’re legit.
Schur’s statement on behalf of Yelp (which is long and not published in its entirety here) also noted that the company has long worked to mitigate fake and otherwise unscrupulous reviews, and that the platform’s current policies go beyond what the FTC requires. But restaurant operators I’ve spoken with in the past have, at times, expressed frustration with online reviews and the platforms that host them. Two years ago, a bunch of restaurateurs were essentially extorted via Google’s starred review system: people posted strings of one-star reviews on their pages, tanking their overall ratings, then wrote the restaurants and demanded (relatively small) amounts of money, paid via gift cards. It took a couple of weeks, but Google eventually removed the offending reviews and the behavior — or at least one round of it — stopped.
I covered the issue for Bon Appétit in July 2022, interviewing several affected restaurant owners and questioning the point of online reviews that couldn’t be verified. They were, understandably, very frustrated. Occasionally, an interview yields a quote that sticks with me long after I’ve published a piece. This one, from Chicago chef and restaurant owner Zoe Schor, is among them:
“We put so much faith in these machines and these systems that are all so arbitrary and subjective. In the grand scheme of stupid things that restaurant owners have to deal with,” Schor said, “this is just one more stupid thing.”