Algorithms define Amazon, governing everything from what it displays on the home page to how it packs your parcel. But a new investigation suggests that its code isn’t necessarily working in the best interests of customers’ bank balances.
By studying 250 frequently purchased items, ProPublica figured out how the retailer recommends products for purchase. Perhaps unsurprisingly, they found that Amazon’s code places its own products, and those from Fulfilled by Amazon sellers, in what’s known as the Buy Box, a listing that pops up to offer a convenient purchase.
The convenience clearly wins many people over: apparently the majority of shoppers buy the suggested item. But across the 250 products, the algorithm-selected choice cost an average of $7.88 more than the cheapest version available elsewhere on the site, a markup of roughly 20 percent.
When customers compare options using Amazon’s “price + shipping” listings, the site also omits delivery costs for its own products and those of Fulfilled by Amazon sellers. While that plays out just fine if you’re an Amazon Prime member or buy over $49 worth of goods, both of which secure free shipping on those items, others will notice later in the process that they have to pay for shipping. Include shipping for all the items, and the Amazon products slide down the rankings. Amazon tells ProPublica that its “vast selection, world-class customer service and fast, free delivery” are, along with price, important to its customers.
This isn’t the first time that Amazon has come under fire for its sales practices. Once upon a time, it toyed with dynamic pricing for DVDs, though it swiftly made a U-turn in the face of criticism. More recently, the rollout of its same-day delivery service sparked controversy when Prime members in poor and black neighborhoods were found to be getting left behind.
Predictably, Amazon didn’t provide ProPublica with any details about how its algorithms work. In fact, we may never know, but our bank accounts might.