Five-star fakery: how online reviews may not be as trustworthy as they seem


Ever since the internet gave us the opportunity to air our opinions on products and services, it’s been evident that many of those points of view aren’t as honest as they seem. Whether it’s authors using pseudonyms to give glowing compliments to their own books, or companies slandering products made by their competitors, the world of online reviews has become another battleground in the war on misinformation. A recent investigation by the British consumer rights watchdog Which? uncovered a network of Facebook groups through which companies actively offer free products in exchange for glowing reviews on Amazon’s website – a Payola scheme for the 21st century.

“We only infiltrated a handful of groups and the number of people involved was dizzying. There are so many more out there that we just don’t know about, and it’s very worrying,” says Which?’s Adam French.

How the scam works

This particular scam operates very simply. The company asks a potential reviewer to buy a product on Amazon and leave a five-star review, at which point it issues a refund of the cost of that product via PayPal. The person gets their free product, the company gets its glowing review, but for consumers who rely on this feedback to make choices, the result is manipulation on a grand scale. User reviews have become hugely important in an era where there's growing cynicism about traditional marketing techniques – indeed, a Which? survey found that 97 per cent of us use them.

According to another study, three negative reviews are enough to dissuade 63 per cent of people from making a purchase. In addition to this, Harvard Business School has found that restaurants improving their ranking by one star on a particular reviews site leads to a 5 per cent increase in revenues. In short, feedback matters.

With such close attention paid to our opinions, we theoretically have more power than we ever did to inform each other, and in turn that should allow the best products and services to flourish. “It’s social proof, isn’t it?” says French. “It’s evidence from your peer group. Unfortunately, shady marketeers exploit that trust, and the result is people getting misled into buying things they wouldn’t have bought otherwise. Millions of pounds have been wasted on shoddy products because of fake reviews.” A 2015 briefing by the European Parliament put the proportion of fake online reviews as high as 16 per cent.

For decades, consumer decisions were based either on personal experience, word of mouth or advertising. The internet ripped a huge hole in that model, and some firms, terrified at their sudden loss of control, began to make poor decisions of their own. Last year, a Manhattan jeweller served time in prison after creating forged court orders to compel Google to remove bad reviews of his business from its index. Earlier this year, Australian property firm Meriton was fined A$3 million (Dh7.9m) for manipulating the TripAdvisor review process, thereby lessening the likelihood of negative reviews. “There will always be those few companies that wish to undermine the principles of an open environment in order to manipulate and mislead, providing a rose-tinted view of their world,” says Dave Robertson from review site Trustpilot.

How to combat it

Rather than risk getting caught red-handed, some companies opt for the strategy uncovered by Which?, i.e., offering incentives to consumers to manipulate the reviews system on their behalf. That black market has existed for many years, and has subtly changed its tactics as websites such as Amazon, Trustpilot and TripAdvisor developed new methods of detecting suspicious patterns.

If you’re wise to those patterns, you can spot some of them yourself; for example, reviewers posting dozens of reviews a week for a seemingly random selection of products, e.g. books on the menopause, C++ programming, self-defence, raw food diets, Buddhism, penny stocks and container gardening. Huge amounts are now invested by websites to stop their reviews being infiltrated in this way. “We have AI that looks at hundreds of data points simultaneously and a team of 50 investigators reviewing 50,000 tickets a month to prevent people gaming our platform,” says Robertson.


On one level, then, it’s a battle fought between computer boffins, with fraudsters’ technicians trying to sidestep the measures used to detect them. One 2013 investigation found teams of computer science specialists in India, Bangladesh and Indonesia bidding for such work on freelance websites. But many other techniques have been deployed to help fight the menace; from naming and shaming companies found to be involved in fraud, to schemes that allow users to flag suspicious reviews themselves, to local regulators imposing large fines. This summer, the problem was even addressed by the International Organisation for Standardisation (ISO), which introduced a new standard, ISO 20488, governing the moderation and publication of online reviews.

The “rewards for reviews” scheme, however, involves very little subterfuge and is hard for websites to detect. “We’ve seen Google acting hard to stop people gaming its search engine,” says French, “and so within the technology space there are probably a lot of learnings about how you battle this problem – but I don’t think there’s an overnight fix. Alongside gradual improvement from the tech industry, it really needs a change in consumer behaviour.”

Staying wary of information online

In effect, the cynicism that we have developed about advertising also needs to be applied to online reviews, which have inevitably become a hugely important marketing platform. Among the tell-tale signs to look out for are products with reviews containing wild differences of opinion (lots of one-star and five-star reviews), dates on which reviews were posted (many in a short space of time could indicate an orchestrated campaign) and reviewing histories of individuals, including whether they tend to have extreme opinions, and what kind of products they’re reviewing.
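The tell-tale signs above can even be checked programmatically. As a minimal sketch (not any site's actual detection system), the following Python function flags two of the patterns described: a polarised spread of ratings (mostly one- and five-star) and many reviews landing in a short space of time. The thresholds are illustrative guesses, not industry values.

```python
from datetime import date, timedelta

def review_red_flags(reviews, polar_threshold=0.8,
                     burst_window_days=7, burst_threshold=10):
    """Return a list of warning flags for a product's reviews.

    `reviews` is a list of dicts with "stars" (1-5) and "date" keys.
    Thresholds are hypothetical, chosen only for illustration.
    """
    flags = []

    # Sign 1: wild differences of opinion - lots of 1- and 5-star reviews.
    stars = [r["stars"] for r in reviews]
    if stars:
        extreme = sum(1 for s in stars if s in (1, 5)) / len(stars)
        if extreme >= polar_threshold:
            flags.append("polarised ratings")

    # Sign 2: many reviews in a short space of time, which could
    # indicate an orchestrated campaign. Sliding-window count.
    dates = sorted(r["date"] for r in reviews)
    window = timedelta(days=burst_window_days)
    start = 0
    for end in range(len(dates)):
        while dates[end] - dates[start] > window:
            start += 1
        if end - start + 1 >= burst_threshold:
            flags.append("review burst")
            break

    return flags

# Example: a dozen five-star reviews posted within three days
# trips both heuristics.
suspect = [{"stars": 5, "date": date(2018, 6, 1 + i % 3)} for i in range(12)]
print(review_red_flags(suspect))
```

A real detector would weigh many more signals (reviewer history, verified purchases, text similarity), but the same sliding-window idea underlies burst detection generally.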

For its part, Trustpilot would welcome regulatory support that doesn’t infringe freedom of speech. “Trying to manipulate and mislead consumers should be criminal,” says Robertson, “and we fully support the recent decision by an Italian court to prosecute the owner of a business selling fake reviews on TripAdvisor.” Our role, meanwhile, is to be wary of the information we encounter online, and not to make snap purchasing decisions based on reviews that might simply not be true.
