At Amazon, it’s the worker vs the algorithm
- It’s a grim glimpse of a future in which AI is the boss — and employees are disposable.
- Amazon uses a computer system to automatically track and fire hundreds of fulfillment center employees for failing to meet productivity quotas.
For decades, many have warned of a looming automation crisis, one in which workers are gradually, then all at once, replaced by intelligent machines. Those warnings, however, mask the fact that an automation crisis has already arrived. The robots are here, they are working in management, and they are even deciding whether workers stay or go. Within the walls of e-commerce and logistics giant Amazon, for instance, an AI-driven algorithm determines when an Amazon Flex driver has outlived their usefulness and needs to be let go.
As dystopian as it sounds, an investigation by Bloomberg found that Amazon is dealing with its Flex drivers in exactly this way. The online retail giant has largely replaced middle managers and human resources staff with artificial intelligence that monitors, surveils, and can even hire and fire Amazon Flex drivers. Flex drivers are “gig” workers who handle packages that haven’t made it onto an Amazon van but need to be delivered the same day.
To begin with, Amazon became the world’s largest online retailer in part by outsourcing its sprawling operations to machine-based algorithms — sets of computer code designed to solve specific problems. For years, the company has used algorithms to manage the millions of third-party merchants on its online marketplace, drawing complaints that sellers have been booted off after being falsely accused of selling counterfeit goods or of jacking up prices.
Not Amazon HR, but the algorithm tracking workflow
In the case of Flex drivers, the algorithm receives data on when drivers were active, how many deliveries they made in that time, and whether delivered packages fell victim to theft by so-called “porch pirates”. These numbers are crunched into a rating for each individual driver. One too many bad ratings, and the driver could expect an automated email telling them their services were no longer needed.
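To make the mechanism concrete, here is a minimal sketch of the kind of scoring pipeline Bloomberg describes: activity time, delivery counts, and theft reports are reduced to a rating, and a run of bad ratings triggers a termination flag with no human in the loop. Every weight, threshold, and field name below is invented for illustration; Amazon’s actual system is proprietary and not public.

```python
# Hypothetical illustration only — all numbers and names are assumptions,
# not Amazon's real criteria.
from dataclasses import dataclass


@dataclass
class DriverRecord:
    hours_active: float    # time logged in a delivery block
    deliveries: int        # packages delivered in that time
    reported_stolen: int   # packages lost to "porch pirates"
    bad_ratings: int = 0   # running count of poor scores


def rate_block(rec: DriverRecord,
               min_rate: float = 10.0,       # assumed deliveries-per-hour floor
               theft_penalty: float = 2.0) -> str:
    """Score one delivery block and return 'good' or 'bad'."""
    rate = rec.deliveries / rec.hours_active if rec.hours_active else 0.0
    score = rate - theft_penalty * rec.reported_stolen
    return "good" if score >= min_rate else "bad"


def update_and_check(rec: DriverRecord, rating: str, max_bad: int = 3) -> bool:
    """Track bad ratings; True means the driver would be flagged for an
    automated termination email. Note what is missing: no step here
    considers context such as weather, locked buildings, or bad GPS —
    the gap former managers criticized."""
    if rating == "bad":
        rec.bad_ratings += 1
    return rec.bad_ratings >= max_bad
```

The point of the sketch is how little it takes: a handful of thresholds turns noisy field conditions into a firing decision, and any condition the designers did not encode simply does not exist as far as the system is concerned.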
Bloomberg interviewed 15 Flex drivers, including four who say they were wrongly terminated, as well as former Amazon managers who say the largely automated system is insufficiently aware of the variables and challenges drivers face every day. Frankly speaking, it is not even the blind trust in the algorithm that is infuriating. It’s the shrug when such a life-changing decision is left to a machine.
Bloomberg says many Amazon Flex drivers did not take their disputes to arbitration because of a US$200 fee and little expectation of success. In doing so, they may also have denied the algorithm the kind of “false positive” data it would need to improve. Former Amazon managers who spoke to Bloomberg accuse their old employer of knowing that delegating work to algorithms would lead to mistakes and damaging headlines. Instead, they say, Amazon decided it was cheaper to trust the algorithms than to pay people to investigate mistaken firings, so long as the drivers could be replaced easily.
Should we fear such an approach?
Robots could replace as many as two million more workers in manufacturing alone by 2025, according to a recent paper by economists at MIT and Boston University. The World Economic Forum (WEF) also concluded in a recent report that “a new generation of smart machines, fueled by rapid advances in artificial intelligence (AI) and robotics, could potentially replace a large proportion of existing human jobs.”
Experts have called for regulations forcing companies to be transparent about how algorithms affect people, giving them the information they need to call out and correct mistakes. Legislators have studied the matter but have been slow to enact rules to prevent harm. In December, Senator Chris Coons of Delaware introduced the Algorithmic Fairness Act. The act would require the Federal Trade Commission to create rules ensuring that algorithms are used equitably and that those affected by their decisions are informed and have the opportunity to reverse mistakes.