Go on TripAdvisor’s UK site and you can no longer find the “reviews you can trust” slogan anywhere (it still survives on the US site, in tiny grey print near the bottom of the page). Having been legally forced to remove the phrase, the company may soon have to settle for “a lot of fabricated reviews you can sort of trust”: new research suggests that about 3.6 million of TripAdvisor’s reviews are likely fake.
TripAdvisor’s not the only culprit; Yelp has been hit with several lawsuits over defamation and false reviews too. Second maybe only to online commenters, online reviewers (Yelpers in particular) can be some of the worst people on the Internet — it seems like the only time anyone is compelled to write a review is right after THE WORST EXPERIENCE of their life.
But as easy as it is to spot (and ignore) a ranting lunatic, apparently we’re pretty bad at distinguishing a fake review from a real one — only slightly better than chance, the researchers found. The motivations for writing a fake review are obvious, and the methods simple; companies can simply write their own or pay third parties to do it for them. Which poses a problem: If we can hardly tell the difference between honest Abe and some sketchy company hired to boost ratings, what good are “consumer” reviews anyway?
The Cornell computer scientists initially set out to pinpoint linguistic clues or patterns that could help users flag fake reviews. Realizing that scammers might use their research for evil, and that telltale keywords are domain-specific (camera reviewers don’t use the same language as restaurant reviewers), they decided to look closer at the sites, not the scammers, to determine how and where fake online reviewers thrive.
In the same way that epidemiologists estimate cancer growth rates from a sample population, the researchers estimated the rate of growth for one particular type of lie — a positive review from someone who’s never been to that place before (internal fake reviews are much harder to detect) — and found that the rate of deceptive consumer reviews has been steadily increasing on sites like TripAdvisor and Yelp.
And it makes sense — like most things in life, deception is most common when it’s easy to do and the rewards are greatest. On TripAdvisor and Yelp, there are few hurdles for new reviewers, making it easier for the scammers to get through. Combined with the fact that these sites also depend heavily on their reviewers, TripAdvisor’s rate of phony reviews has increased to almost six percent and Yelp’s to almost four percent since 2009, the researchers found.
“Six percent doesn’t seem like much, but out of the 60 million reviews on TripAdvisor, that’s 3.6 million fake reviews,” Jeff Hancock, one of the researchers, told me. Out of the six sites that Hancock and his team analyzed — Yelp, TripAdvisor, Hotels.com, Expedia, Priceline and Orbitz — only Yelp and TripAdvisor’s rates of fake reviews are increasing; the other four sites all hover right around two percent.
That’s still a pretty solid amount of spam out there, so the researchers decided to create a free plugin for browsers called Review Skeptic that screens for fake reviews (only in beta, and only for TripAdvisor right now) based on their original research that “uses language models to spot fake reviews with nearly 90% accuracy.” As you’re scanning TripAdvisor, the plugin automatically labels each review as “suspect” or “real”:
If this is actually real, this is the creepiest hotel concierge ever.
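The paper behind Review Skeptic describes training a classifier on word-frequency features from known-truthful and known-deceptive reviews. Here’s a minimal, purely illustrative sketch of that general idea in Python: a tiny Naive Bayes model over bags of words. The training examples and the “suspect”/“real” labels below are made up for demonstration, not taken from the Cornell study, and a real system would train on thousands of labeled reviews:

```python
import math
from collections import Counter

# Toy training data (hypothetical examples, not from the Cornell study).
TRUTHFUL = [
    "small room but the location was great and staff were helpful",
    "breakfast was cold and the elevator broke twice during our stay",
]
DECEPTIVE = [
    "this amazing luxurious hotel was the best experience of my life",
    "absolutely perfect wonderful amazing stay i will definitely return",
]

def word_counts(docs):
    """Count word occurrences across all documents in one class."""
    counts = Counter()
    for doc in docs:
        counts.update(doc.split())
    return counts

def log_likelihood(words, counts, vocab_size):
    """Laplace-smoothed log probability of the words under this class."""
    total = sum(counts.values())
    return sum(math.log((counts[w] + 1) / (total + vocab_size)) for w in words)

def classify(review):
    """Label a review 'suspect' or 'real' by comparing class likelihoods."""
    truthful = word_counts(TRUTHFUL)
    deceptive = word_counts(DECEPTIVE)
    vocab_size = len(set(truthful) | set(deceptive))
    words = review.lower().split()
    if log_likelihood(words, deceptive, vocab_size) > log_likelihood(words, truthful, vocab_size):
        return "suspect"
    return "real"

print(classify("amazing wonderful perfect hotel best stay"))  # "suspect"
print(classify("room was small and breakfast was cold"))      # "real"
```

One finding of the original research was along these lines: deceptive reviews lean on superlatives and storytelling, while truthful ones dwell on concrete spatial details, which is exactly the kind of signal word-frequency models can pick up.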
TripAdvisor’s director of content integrity, Andrew Marane, assured me that “the vast majority of content that comes into TripAdvisor is from actual users with real experiences,” and that their algorithm uses factors like geography, history, and the reputation of the user to screen for fraudulent reviews. When asked whether TripAdvisor has considered implementing stricter posting standards, e.g. requiring proof of a hotel stay, Marane told me that allowing anyone to review is integral to TripAdvisor.
“There’s real value in not limiting who can express their opinion about their experience based on how they paid for it or where they booked it,” he said, arguing that booking sites that require unique proof of payment would prevent a husband and wife, or the entire family, from reviewing.
I think that’s kind of the point? Making it harder to write a review makes it harder to write a fake review. I’d rather have a few truthful, legitimate reviews from people I know have been to said place than hundreds from people who may or may not have — even if that means I’m not going to get a review from every single member of the family (I trust that the person taking the time to write a review can capture the group sentiment).
TripAdvisor isn’t opposed to implementing something like Review Skeptic, says Marane, if they find it’s the most effective way of detecting fraud. But I get the feeling that their whole “community” approach to reviewing might be tainted when a truthful reviewer is labeled SUSPECT — nobody likes being called a liar.