Dennis Zhang, Olin associate professor of supply chain and technology, has been named a co-winner of the 2022 Production and Operations Management Society (POMS) Early Career Research Accomplishments Award.
The award is one of the most prestigious honors in the operations management field. Zhang received it in April during the POMS conference, which was held virtually because of COVID-19 concerns.
“I am honored to receive this recognition,” Zhang said. “This is very motivating, and I hope to continue contributing to the field through research and service.”
POMS chose Zhang based on his contribution to platform operations, especially retail platform operations, as well as his contribution to data-driven methodologies in operations, such as field experiments and applied machine learning.
Zhang joined the Olin Business School in 2016. His research focuses on operations in innovative marketplaces and in the public sector. He builds theoretical models to extract reliable insights from data and uses data to improve existing models. Before joining the Olin faculty, he earned his PhD at Northwestern University and worked at Google as a machine learning software engineer.
The award co-winners are Ruomeng Cui of Emory University and Hummy Song of the University of Pennsylvania.
How humans perceive and interact with algorithms matters. A growing concern for businesses is that algorithms may reproduce or even magnify inequality in the workplace.
Associate Professor Dennis J. Zhang and coauthors researched how people perceive the fairness of algorithmic decisions in comparison with human decisions. Zhang discussed the findings last week at Olin’s virtual Fall 2021 Business Research Series.
The researchers studied how receiving a task assignment from an algorithm, rather than from a person, changes workers’ perceptions of fairness and, in turn, their productivity. For instance, workers might feel treated unfairly if they think they’re asked to pick too many items or that someone is playing favorites.
They conducted a 15-day randomized field experiment in 2019 with Alibaba Group in a warehouse where workers pick products to box based on orders known as “pickbills.” The experiment involved 50 workers randomly assigned to one of two groups.
What goes in which box?
Retail platforms such as Alibaba, the largest online retail platform in China, use algorithms to determine which set of items will be packed into which box. They then need humans to follow the algorithmic prescriptions and pack the items. In essence, a pickbill directs a worker to pick specific items from different shelves for a particular set of packages. In the past, people distributed pickbills to workers in Alibaba’s warehouses. But Alibaba had started assigning pickbills algorithmically, in which case workers receive picking tasks by scanning a barcode with a handheld radio-frequency device.
Alibaba’s interest in switching to algorithmic task assignment was mainly motivated by the hope that it would improve efficiency. The researchers, however, wondered whether the change could also bring productivity benefits by shifting workers’ perceptions of fairness.
Half of the workers in the field experiment were randomly assigned pickbills from a machine that ostensibly relied on an algorithm. The other half received pickbills from a person who secretly followed the assignments from the same algorithm.
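The two-arm design described above can be sketched in a few lines. This is an illustrative reconstruction, not the study’s actual procedure; the worker IDs and the seed are invented for the example.

```python
import random

def assign_arms(worker_ids, seed=42):
    """Randomly split workers into equal 'algorithm' and 'human' arms.

    The seed is a hypothetical choice for reproducibility; the study's
    actual randomization procedure is not described in the article.
    """
    rng = random.Random(seed)
    shuffled = list(worker_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {
        "algorithm": shuffled[:half],  # pickbills dispatched by the machine
        "human": shuffled[half:],      # pickbills handed out by a person
    }

arms = assign_arms(range(50))
print(len(arms["algorithm"]), len(arms["human"]))  # 25 25
```

Because both arms followed the same underlying algorithm, any difference in outcomes can be attributed to who (or what) appeared to make the assignment.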
The researchers studied the question of how people perceive the fairness of algorithmic judgment and how such perceptions affect their behavior. During the experiment, they collected performance data about 4,486 pickbills along with 108 daily questionnaires from workers.
Results surprise researchers
As it turned out, workers perceived the algorithmic assignment process as fairer than the human process.
“That’s one of the results that actually surprised us the most,” Zhang said. In existing literature, “there is a huge discussion” about how algorithms are not fair. “But what we actually show is in simple settings workers actually think algorithms are more fair.”
Receiving tasks from an algorithm (relative to a human) also significantly increased workers’ picking efficiency by 17-19%.
The productivity gain from the algorithmic assignment was larger for more educated workers and workers who cared more about the difficulty of their pickbills, groups for which perceived fairness has a stronger effect on productivity.
The finding that workers perceive algorithms as fairer than humans can generalize to other settings where workers are evaluated and affected by algorithms, and can be used to create fairer and more productive working environments, the authors say. “The Impacts of Algorithmic Work Assignment on Fairness Perceptions and Productivity” is under review at Manufacturing & Service Operations Management.
The Olin Award, which includes business school recognition and a $10,000 prize, is intended to promote scholarly research that has timely, practical applications for complex management problems.
Conventional bin-packing algorithms prescribe which items to pack in which sequence in which box. They focus on the best use of a box’s volume. But here’s the problem: Those algorithms tend to overlook how humans might deviate from instructions and create delays. Workers might not organize items as the algorithm prescribes if, for instance, packing a box is complex because it includes numerous items or items with unusual shapes.
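To make the volume-only focus concrete, here is a minimal sketch of one classic bin-packing heuristic, first-fit decreasing. It is a textbook stand-in, not Alibaba’s production algorithm, and it illustrates the limitation above: it packs purely by volume, with no model of how a human will actually handle the box.

```python
def first_fit_decreasing(volumes, box_capacity):
    """Pack item volumes into boxes using the first-fit decreasing heuristic.

    A textbook volume-only approach: sort items largest-first, then place
    each into the first open box with room, opening a new box if none fits.
    Nothing here models whether a human packer can execute the plan easily.
    """
    remaining = []  # spare capacity of each open box
    packing = []    # item volumes assigned to each box
    for v in sorted(volumes, reverse=True):
        for i, spare in enumerate(remaining):
            if v <= spare:
                remaining[i] -= v
                packing[i].append(v)
                break
        else:
            remaining.append(box_capacity - v)
            packing.append([v])
    return packing

print(first_fit_decreasing([4, 8, 1, 4, 2, 1], box_capacity=10))
# → [[8, 2], [4, 4, 1, 1]]
```

The heuristic minimizes wasted box volume, but a box stuffed to capacity with many small or oddly shaped items is exactly the kind of order where workers deviate and delays appear.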
“It takes the algorithm and the executors of the algorithm—the people delivering the outcomes,” said Zhang, assistant professor of operations and manufacturing management. “I call this particular characteristic artificial intelligence and human collaboration. Such particular characteristics allow us to design better algorithms.”
Here are some takeaways from the research:
The new algorithm predicts which orders will confuse workers and adjusts the box size to a larger one.
The cost of box material may increase, but the savings from fewer packing delays outweigh it.
AI and robotics can improve human work by providing more support for the decisions people make while working.
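The upsizing idea in the takeaways above can be sketched as follows. The thresholds, the single-step upsizing rule, and the “odd shape” flag are hypothetical simplifications for illustration, not the paper’s actual prediction model.

```python
def choose_box(items, box_volumes, max_items=6, max_odd=1):
    """Choose a box volume, upsizing one step for orders likely to confuse packers.

    `items` is a list of (volume, is_odd_shaped) pairs. The thresholds and
    the one-step upsizing rule are illustrative, not taken from the paper.
    """
    total = sum(volume for volume, _ in items)
    fitting = sorted(s for s in box_volumes if s >= total)
    if not fitting:
        return None  # no standard box holds the order
    confusing = (len(items) > max_items
                 or sum(1 for _, odd in items if odd) > max_odd)
    if confusing and len(fitting) > 1:
        return fitting[1]  # one size up leaves slack for imperfect packing
    return fitting[0]

boxes = [5, 10, 20]
print(choose_box([(1, False)] * 3, boxes))  # simple order → 5
print(choose_box([(1, True)] * 8, boxes))   # many odd-shaped items → 20
```

The trade-off in the takeaways shows up directly: the confusing order consumes a larger box, but the slack is what prevents the human-caused delay.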
Last year, Zhang and Jake Feldman, assistant professor of operations and manufacturing management, received the award. They used data from Alibaba to test the benefits of—and recommend a solution for—presenting buyers the optimal variety of products available for purchase within individual online retail stores.
In this year’s winning research, the idea is not to strive for autonomous automation, the authors wrote. “We believe that AI and robotics can improve human work by providing more decision support while always empowering human judgment, oversight and discretion.”
Zhang presented his research virtually to members of the Olin community and guests on June 25.
Coauthors of the paper are Jiankun Sun of Imperial College Business School, Haoyuan Hu of the Alibaba Group and Jan A. Van Mieghem of Northwestern University.
As companies trim their hierarchies and form teams of employees to manage themselves, WashU Olin researchers are sounding warning bells.
Inequity “is likely to be a significant problem”—especially for women, who made one-fourth less than their male counterparts in a new study of self-managed teams.
Zappos, Google, Facebook and others have adopted them. The teams are meant to boost productivity, offer flexibility, attract young people and foster creativity. Ideally, they allocate tasks based on employees’ strengths and then assign rewards—equitably—based on their contributions.
But how well do the teams actually work?
“Inherently, they aren’t as awesome as people think,” said Lamar Pierce, professor of organization and strategy and associate dean of the Olin-Brookings Partnership. (Zhang, professor of operations and manufacturing management, was in Beijing and unavailable for an interview.)
Finding: Women were paid 24% less than men
For 50 months, Pierce, Zhang and Wang studied productivity and bargaining traits in a service operation setting: a chain of 32 large beauty salons with 932 workers in China. About half (54%) of the workers were men.
They found that the men consistently extracted “advantageous bargaining values from their female coworkers, despite having no observable productivity advantage.”
In fact, women in the sample earned at least 24% less than their equally productive male counterparts.
That gender pay gap is larger than the pay gaps found in places with hierarchical management structures. A 2005 study found a gap of 10% in Asia and larger inequities in the United States and Europe.
The new evidence on self-managed teams has implications for US organizations.
“You see these dynamics playing out in Silicon Valley all the time. You see them playing out in academia,” Pierce said.
“Social interactions between men and women have consistencies across culture, across economic class, across age,” he said. “Show me the culture where women don’t tend to get worse negotiation or bargaining outcomes.”
‘Stuck with a bunch of overpaid men’
A combination of higher “prosociality” and lower bargaining power in women most likely explains the 24% wage disparity, the researchers report. Prosocial behavior includes feeling concern for others and acting to benefit them.
When the workers divided their own team-based compensation, women were severely underpaid for their productivity. Consider this: Women were the top salespeople in the beauty salons, and those women took home only the median wage.
“This is really bad because they’re going to leave, and when they leave then (the company is) going to be stuck with a bunch of overpaid men,” Pierce said.
The researchers compiled data from three sources between April 2009 and May 2013: point-of-sale data from each salon, which included every service and card-for-service transaction; the internal human resources system, which included the commission paid to each worker for each transaction; and detailed demographic information on each worker in those transactions.
They found that gender “strongly predicts” under- or overcompensation relative to productivity. Men made up a disproportionate number of highly paid yet unproductive workers. Women were overrepresented among “star employees” with poor bargaining outcomes.
The researchers used a previously proven algorithm to confirm gender as the strongest predictor of bargaining outcomes.
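The article does not name the algorithm, but one common model-agnostic way to rank predictors is permutation importance: shuffle one feature at a time and measure how much predictive accuracy drops. Here is a minimal sketch on toy data; the predictor, features, and data are invented for illustration and are not the study’s.

```python
import random

def permutation_importance(predict, X, y, metric, seed=0):
    """Importance of each feature = drop in the metric when that column is shuffled."""
    rng = random.Random(seed)
    baseline = metric(y, [predict(row) for row in X])
    importances = []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        rng.shuffle(col)
        X_perm = [row[:j] + [col[i]] + row[j + 1:] for i, row in enumerate(X)]
        importances.append(baseline - metric(y, [predict(row) for row in X_perm]))
    return importances

def accuracy(y_true, y_pred):
    return sum(a == b for a, b in zip(y_true, y_pred)) / len(y_true)

# Toy example: feature 0 fully determines the label, feature 1 is noise.
X = [[i % 2, (i // 2) % 2] for i in range(40)]
y = [row[0] for row in X]
scores = permutation_importance(lambda row: row[0], X, y, accuracy)
print(scores)  # feature 1 scores exactly 0; feature 0 scores higher
```

In the same spirit, a feature whose shuffling destroys the model’s ability to predict bargaining outcomes is a strong predictor of those outcomes.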
Out of sight, out of mind
Pierce emphasized that self-managed teams can work well—with clear guidelines in place. The study highlights how important it is to monitor and enforce pay equity in self-managed teams. After all, those teams assign tasks, responsibilities and rewards for teammates.
“Because this design decision effectively puts inequity ‘out of sight for the manager,’ it may also put this inequity ‘out of mind,’” the researchers write. But that doesn’t absolve a manager of the responsibility to stem bias and discrimination. One solution is to set up formal rules for the assignment of tasks and rewards, the findings suggest.
“Do you have safeguards in place to ensure that the person who gets all the credit, who gets the rewards, who gets the best task is not simply the one who bargains the best or bargains the most aggressively?” Pierce asked.
The research findings also imply that firms could cut costs by replacing overpaid workers. And the findings show that good workers who are underpaid lose motivation—and often leave.
“Managers must anticipate and mitigate this gender-based inequity,” the researchers write, both because it is an operational performance issue and “because of the myriad of productivity, retention and ethical implications that can result from peer-based bargaining.”