This week, Amazon acknowledged reality: It has a problem with bogus reviews.
The trouble is that Amazon pointed blame at almost everyone involved in untrustworthy ratings, and not nearly enough at the company itself. Amazon criticized Facebook, but it didn’t recognize that the two companies share an underlying problem that risks eroding people’s confidence in their services: an inability to effectively police their sprawling websites.
Learning from the masses is a promise of the digital age that hasn’t panned out. It can be wonderful to evaluate others’ feedback before we buy a product, book a hotel or see a doctor. But it’s so common and lucrative for companies and services to pay for or otherwise manipulate ratings on all sorts of websites that it’s hard to trust anything we see.
The persistence of bogus reviews raises two big questions for Amazon: How much attention does it really devote to stopping bogus customer feedback? And would shoppers be better off if Amazon re-evaluated its essence as an (almost) anything-goes online bazaar?
Amazon’s rules prohibit companies from offering people money or other incentives for reviews. Amazon says that it catches most bogus ratings and works to stay ahead of rule breakers. Still, the global industry of review fraud operates actively on Amazon, and everyone knows it.
Amazon seems to have been prodded into taking some action against manipulated ratings by the Federal Trade Commission, according to Vox’s Recode publication, and by journalists.
After a Wall Street Journal columnist wrote this week about buying a RAVPower electrical charger that came with a postcard offering a $35 gift card in exchange for a review, the vendor said on Thursday that it had been banned from Amazon. (The statement is in Chinese, and I read it via Google Translate.) That followed bans on several other large sellers that appeared to have been buying reviews for years.
If government lawyers and newspaper columnists can spot sellers openly manipulating reviews, how hard is the company really looking for them?
Maybe you’re thinking that this is just how the world works: Caveat emptor. When I read ratings of products on Amazon or of physicians on Zocdoc, the feedback is helpful but I take it with a grain of salt.
But unfortunately lots of people are harmed by bogus reviews, and they’re not always easy for us to spot. The Washington Post recently wrote about a family fooled by bought-off Google ratings for an alcohol addiction treatment center. I wrote last year about research that found that Amazon catches many bought-off reviews, but only months later and after shoppers showed signs of feeling misled into buying a product.
I wish that Amazon would take more responsibility for the problem. In its statement this week, the company blamed social media companies and poor enforcement by regulators for bogus reviews. Amazon has a point. Fraudulent online ratings are a big business with many enablers. Facebook and China’s WeChat app don’t do enough about forums where companies coordinate review manipulation.
But Amazon didn’t say much about what it could do differently. For example, the University of California researchers I spoke with last fall found that bought-off reviews were far more common among Chinese vendors and for products with many vendors selling nearly identical items. Maybe that means that Amazon should more closely police sellers based in China? Or that it would be helpful to cap the number of sellers that list the same bathroom caddy?
Strong reviews also help sellers appear prominently when we search for products on Amazon, which creates a huge financial incentive to cheat. Should Amazon reconsider how it accounts for ratings in search results? The company didn’t say.
Most of all, it’s disappointing that Amazon doesn’t acknowledge that bogus reviews are a consequence of its choice to opt for quantity over quality.
People can buy almost anything on Amazon and from almost any seller. That can be great for shoppers, but it comes with trade-offs. Being an everything store — and one that tries to operate with as little human intervention as possible — makes it harder for Amazon to root out fake or dangerous products and bought-off reviews.
Before we go …
No more “speed filter.” NPR reports that Snapchat will phase out an app feature that lets people record and share how fast they’re driving. Road safety advocates say that the feature for years has encouraged young people to drive recklessly to get bragging rights.
Using WhatsApp to bust myths: During the pandemic, government health care workers in rural India have been using WhatsApp to counter misinformation about the virus, The Verge reports. It takes a lot of time for health care workers to fact-check information on the app, but the online messages, along with in-person conversations, seem to be keeping many people safe.
LOOK AT THE GIANT BUNNY: My colleague Amanda Hess spoke with people who post online videos of their numerous and exotic animals. The niche called Pet Tube caters to our love for sight gags like a pile of snakes slithering on a piano, but these people also love animals — “even potentially revolting swarms of animals,” Amanda wrote.
Hugs to this
A baby seal tests out the water. The little one shifts from uncertainty to glee in a flash.