How humans perceive and interact with algorithms matters. A growing concern for businesses is that algorithms may reproduce or even magnify inequality in the workplace.
Associate Professor Dennis J. Zhang and coauthors researched how people perceive the fairness of algorithmic decisions in comparison with human decisions. Zhang discussed the findings last week at Olin’s virtual Fall 2021 Business Research Series.
The researchers studied how a task assignment from an algorithm, versus one from a person, changes workers’ perceptions of fairness—and changes the workers’ productivity. For instance, a worker might feel treated unfairly if he thinks he’s asked to pick too many items or if he thinks someone is playing favorites.
They conducted a 15-day randomized field experiment in 2019 with Alibaba Group in a warehouse where workers pick products to box based on orders known as “pickbills.” The experiment involved 50 workers randomly assigned to one of two groups.
What goes in which box?
Retailing platforms such as Alibaba, the largest online retail platform in China, use algorithms to determine which set of items will be packed into which box. Humans then follow the algorithmic prescriptions to pack the items: a worker follows a pickbill to collect specific items from different shelves, each belonging to a particular set of packages. In the past, people distributed pickbills to workers in Alibaba’s warehouses. But Alibaba had begun assigning pickbills algorithmically, with workers receiving picking tasks by scanning a barcode with a handheld radio-frequency device.
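To make the workflow concrete, here is a minimal, hypothetical sketch of how pickbills might be distributed across available pickers. The data shapes, function names, and the simple “least-loaded worker” rule are illustrative assumptions, not Alibaba’s actual assignment algorithm, which is not described in the study summary.

```python
from dataclasses import dataclass, field

@dataclass
class Pickbill:
    bill_id: str
    items: list  # shelf locations to visit, e.g. ["A3", "B7", "C1"]

@dataclass
class Worker:
    worker_id: str
    assigned: list = field(default_factory=list)

def assign_pickbills(pickbills, workers):
    """Hypothetical greedy assignment: give each incoming pickbill to the
    worker with the lightest current workload (fewest items assigned).
    Illustration only -- not the production algorithm used in the study."""
    for bill in pickbills:
        least_loaded = min(
            workers, key=lambda w: sum(len(b.items) for b in w.assigned)
        )
        least_loaded.assigned.append(bill)
    return workers

# Example: three pickbills spread across two pickers.
workers = assign_pickbills(
    [Pickbill("PB-1", ["A3", "B7"]), Pickbill("PB-2", ["C1"]), Pickbill("PB-3", ["D4", "D5"])],
    [Worker("W-1"), Worker("W-2")],
)
for w in workers:
    print(w.worker_id, [b.bill_id for b in w.assigned])
```

In the field experiment described below, the point was not the assignment rule itself but who appeared to deliver it: both groups received tasks produced by the same algorithm.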
Alibaba’s interest in switching to algorithmic task assignment was motivated mainly by the hope that it would improve efficiency. The researchers, however, wondered whether the change could also bring productivity benefits by altering workers’ perceptions of fairness.
Half of the workers in the field experiment were randomly assigned pickbills from a machine that ostensibly relied on an algorithm. The other half received pickbills from a person who secretly followed the assignments from the same algorithm.
To study how people perceive the fairness of algorithmic judgment and how such perceptions affect behavior, the researchers collected performance data on 4,486 pickbills, along with 108 daily questionnaires from workers, over the course of the experiment.
Results surprise researchers
As it turned out, workers perceived the algorithmic assignment process as fairer than the human process.
“That’s one of the results that actually surprised us the most,” Zhang said. In existing literature, “there is a huge discussion” about how algorithms are not fair. “But what we actually show is in simple settings workers actually think algorithms are more fair.”
Receiving tasks from an algorithm (relative to a human) also significantly increased workers’ picking efficiency by 17-19%.
The productivity gain from algorithmic assignment was larger for more educated workers and for workers who cared more about the difficulty of their pickbills, groups for which perceived fairness had a stronger effect on productivity.
The finding that algorithms can be perceived as fairer than humans generalizes to other settings where workers are evaluated and affected by algorithms, and it can be used to create fairer, more productive working environments, the authors say. “The Impacts of Algorithmic Work Assignment on Fairness Perceptions and Productivity” is under review at Manufacturing & Service Operations Management.