Among the best-selling books in Amazon’s Epidemiology category are several anti-vaccine tomes. One has a confident-looking doctor on the cover, but the author doesn’t have an MD—a quick Google search reveals that he’s a medical journalist with the “ThinkTwice Global Vaccine Institute.” Scrolling through a simple keyword search for “vaccine” in Amazon’s top-level Books section reveals anti-vax literature prominently marked as “#1 Best Seller” in categories ranging from Emergency Pediatrics to History of Medicine to Chemistry. The first pro-vaccine book appears 12th in the list. Bluntly named “Vaccines Did Not Cause Rachel’s Autism,” it’s the only pro-vaccine book on the first page of search results. Its author, the pediatrician Peter Hotez, a professor in the Departments of Pediatrics and Molecular Virology & Microbiology at the Baylor College of Medicine, has tweeted numerous times about the amount of abuse and Amazon review brigading that he’s had to fight since it was released.
Over in Amazon’s Oncology category, a book with a Best Seller label suggests juice as an alternative to chemotherapy. For the term “cancer” overall, coordinated review brigading appears to have ensured that “The Truth About Cancer,” a hodgepodge of claims about, among other things, government conspiracies, enjoys 1,684 reviews and front-page placement. A whopping 96 percent of the reviews are 5 stars—a measure that many Amazon customers use as a proxy for quality. However, a glance at ReviewMeta, a site that aims to help customers assess whether reviews are legitimate, suggests that over 1,000 may be suspicious in terms of time frame, language, and reviewer behavior.
Once relegated to tabloids and web forums, health misinformation and conspiracies have found a new megaphone in the curation engines that power massive platforms like Amazon, Facebook, and Google. Search, trending, and recommendation algorithms can be gamed to make fringe ideas appear mainstream. This is compounded by an asymmetry of passion that leads truther communities to create prolific amounts of content, resulting in a greater amount available for algorithms to serve up … and, it seems, resulting in real-world consequences.
. . . .
Over the past decade or so, we’ve become increasingly reliant on algorithmic curation. In an era of content glut, search results and ranked feeds shape everything from the articles we read and products we buy to the doctors or restaurants we choose. Recommendation engines influence new interests and social group formation. Trending algorithms show us what other people are paying attention to; they have the power to drive social conversations and, occasionally, social movements.
Curation algorithms are largely amoral. They’re engineered to show us things we are statistically likely to want to see, content that people similar to us have found engaging—even if it’s stuff that’s factually unreliable or potentially harmful. On social networks, these algorithms are optimized primarily to drive engagement. On Amazon, they’re intended to drive purchases. Amazon has several varieties of recommendation engine on each product page: “Customers also shopped for” suggestions are distinct from “Customers who bought this item also bought.” There are “sponsored” products, which are essentially ads. And there’s “frequently bought together,” a feature that links products across categories (often very useful, occasionally somewhat disturbing). If you manage to leave the platform without purchasing anything, an email may follow a day later suggesting even more products.
Amazon shapes many of our consumption habits. It influences what millions of people buy, watch, read, and listen to each day. It’s the internet’s de facto product search engine—and because of the hundreds of millions of dollars that flow through the site daily, the incentive to game that search engine is high. Making it to the first page of results for a given product can be incredibly lucrative.
Unfortunately, many curation algorithms can be gamed in predictable ways, particularly when popularity is a key input. On Amazon, this often takes the form of dubious accounts coordinating to leave convincing positive (or negative) reviews. Sometimes sellers outright buy or otherwise incentivize review fraud; that’s a violation of Amazon’s terms of service, but enforcement is lax. Sometimes, as with the anti-vax movement and some alternative-health communities, large groups of true believers coordinate to catapult their preferred content into the first page of search results. (Amazon disputes this characterization and says they carefully police reviews.)
Amazon reviews appear to figure prominently in the company’s ranking algorithms. (The company will not confirm this.) Customers consider the number of stars and volume of reviews when deciding which products to buy; they’re seen as a proxy for quality. High ratings can lead to inadvertent free promotion: Amazon’s Prime Streaming video platform launched with a splash page that prominently featured Vaxxed, Andrew Wakefield’s movie devoted to the conspiracy theory that vaccines cause autism.
Perhaps compounding the problem, Amazon allows content creators to select their own categories and keywords.
. . . .
With a product base as large as Amazon’s, it’s probably a challenge for the company to undertake any kind of review process. This is likely why quackery shows up classified as “Oncology” or “Chemistry.” It’s a small reminder that Amazon isn’t exactly a bookstore or library.
Link to the rest at Wired
Perhaps PG is mistaken, but his take on contemporary society in the US is that there are a great many people who are anxious to silence those with whom they disagree.
It appears to him that, other than a few types of content that are clearly illegal, content that would “violate laws or copyright, trademark, privacy, publicity, or other rights,” plus pornography, Amazon is content to let its audience decide what it would and would not like to read.