Subtle Bias Can Derail Results

If Not Well Managed, Crowdsourcing Contests Can Produce Undesirable Results, UConn Researchers Discover

Crowdsourcing firms and platform designers may need to revisit their strategies, according to UConn School of Business researchers, because the competitive nature of the work, eager newcomers trying to promote themselves, and subtle biases in how information is presented may be skewing outcomes.

“Without appropriate control, the behavior of the online community might not align with the platform’s designed objective, which can lead to inferior platform performance,” said Professor Jan Stallaert of the Operations and Information Management Department at the School of Business. His research, conducted with co-authors Sulin Ba, associate dean of the School of Business, Professor Xinxin Li, and doctoral candidate Brian Lee, explores new territory in the field of online business-problem solving.

“The way in which people work these days is much different,” Stallaert said. “The workforce has changed, and instead of tasking employees to find solutions, companies are increasingly putting a problem on a website and testing the ideas that hundreds of people have for solving it.”

Jan Stallaert (UConn School of Business)

Crowdsourcing contests, open competitions hosted online, are widely used to generate ideas and solve problems. Many business firms and government agencies, including GE, NASA, and Procter & Gamble, have used crowdsourcing as part of their research and development processes, incentivizing participants with monetary rewards or peer recognition.

The crux of the UConn research is systemic bias, specifically the ‘salience effect,’ which can lead contestants to overweight some of the information they receive. That bias could steer the performance of every worker on the platform, the researchers said. Psychologists have shown that individuals tend to be more confident about information they can retrieve quickly, favoring explicit over implicit information.

Sulin Ba (Melissa Ferrigno/UConn School of Business)

For instance, a corporate recruiter who reads a strong reference letter before examining a full application may form perceptions of the applicant without looking more closely at the resume, creating a “salience” effect that favors that applicant. A similar situation may occur in crowdsourcing competitions.

To conduct their research, the UConn team used archival data from Kaggle, a crowdsourcing platform for predictive modeling. Most problems there are hosted as contests, and the winners typically receive money, a work opportunity, or the chance to attend a conference. Each time contestants upload a solution, the system shows them a public score, and the highest scores are posted on a public leaderboard.

“Sometimes the contest is successful, and other times the results may not be that great. Contestants adjust their solutions based on the ‘public score,’ but that doesn’t tell you how well your recommendations would work if implemented, since a large part of the data is proprietary,” co-author Li said. “While the goal for contestants is to score well, the company’s end goal is to find the best solution for a complex problem.”
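The article doesn’t detail Kaggle’s scoring mechanics beyond this, but the gap Li describes can be sketched in a few lines of Python. Everything in the sketch below is a hypothetical stand-in, including the data, the 30/70 public-private split, and the mean-squared-error metric; it is not Kaggle’s actual implementation. The point it illustrates: the submission that tops the public leaderboard is not necessarily the one that performs best on the held-back data the sponsor cares about.

```python
# Minimal sketch (hypothetical, not Kaggle's code) of a public/private
# scoring split: part of the held-out test set drives the visible public
# score, while the rest decides the final outcome.
import random

random.seed(0)

# Hypothetical ground truth for 1,000 held-out test cases (stand-in data).
truth = [random.random() for _ in range(1000)]
public_idx = set(range(300))                 # 30% drives the public score
private_idx = set(range(1000)) - public_idx  # 70% decides final ranking

def mse(preds, idx):
    """Mean squared error on a subset of test cases (lower is better)."""
    return sum((preds[i] - truth[i]) ** 2 for i in idx) / len(idx)

# Fifty noisy submissions from a contestant tuning against the leaderboard.
submissions = [[t + random.gauss(0, 0.2) for t in truth] for _ in range(50)]

best_by_public = min(submissions, key=lambda s: mse(s, public_idx))
best_by_private = min(submissions, key=lambda s: mse(s, private_idx))

# The public-leaderboard favorite is often not the submission that would
# have scored best on the private data the sponsor actually cares about.
print("private MSE of the public-score favorite:", mse(best_by_public, private_idx))
print("private MSE of the true best submission: ", mse(best_by_private, private_idx))
```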

Xinxin Li (Nathan Oldham/UConn School of Business)

The team studied 102 predictive-modeling contests and 45,226 teams on Kaggle, providing evidence of the salience effect among crowdsourcing workers and showing that it can severely affect crowdsourcing outcomes.

Two effects are critical to the quality of crowdsourcing outcomes: the parallel path effect, which refers to the greater likelihood of obtaining a desirable solution as the number of contestants increases; and the competition effect, which suggests that when squaring off against more competitors, high-ability contestants are more motivated to put effort into creating new solutions.
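The paper models these effects formally; a rough back-of-envelope calculation (a simplification with a made-up success probability, not a number from the study) conveys the intuition behind the parallel path effect: if each of n independent contestants produces an acceptable solution with probability p, the chance that at least one succeeds is 1 − (1 − p)^n, which climbs quickly with n.

```python
# Rough illustration of the parallel path effect (simplified; not the
# paper's model): with n independent contestants, each succeeding with
# probability p, P(at least one success) = 1 - (1 - p)^n.
p = 0.05  # hypothetical per-contestant success probability
for n in (1, 10, 50, 200):
    print(f"{n:>3} contestants -> P(at least one success) = {1 - (1 - p) ** n:.3f}")
```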

The UConn researchers demonstrate that the impact of the salience effect is amplified when the competition effect dominates, but attenuated when the parallel path effect prevails.

Brian Lee (UConn School of Business)

They also found that experienced contestants are less likely to be swayed by the salience effect, and that the effect is stronger among teams competing in contests with larger rewards. In fact, over-incentivizing contestants could actually produce an inferior outcome.

For a firm, investing in innovation through crowdsourcing contests that are subject to systemic bias may result in wasted resources, said the researchers, whose work is currently under review at a prominent journal.

Stallaert and his team concluded that the salience effect persisted among all contestants, including the winners, and that competition amplified its influence on the winners. They also believe that both solution seekers and platform designers should be aware that systemic bias can produce inferior outcomes. Platform designers should consider training contestants to recognize the behavioral temptations that could distort their work.

Identifying the pitfalls and benefits of these platforms will help future researchers and practitioners integrate successful online platforms into their business strategies. “Experienced contestants may exhibit less cognitive bias and focus more on the final result,” Stallaert said. “In contrast to conventional wisdom, where a crowd functions at its peak performance when it comprises a diverse group of workers, platform designers should decide which contests are open only to experienced contestants to improve the solution outcomes.”