Crowdsourcing competitions often hijacked: study

By Charis Palmer, The Conversation

Crowdsourcing competitions, popular with companies seeking to tap into the knowledge of large groups, are often undermined by malicious behaviour, according to a new study.

The research, published today in the Journal of the Royal Society Interface, found the openness of crowdsourced competitions, particularly those with a “winner-takes-all” prize, made them vulnerable to attack.

The researchers used game theory to analyse the trade-off between the potential for increased productivity from crowdsourcing a project and the possibility of it being set back by malicious behaviour. They cited the DARPA Network Challenge as an example of a hijacked crowdsourcing competition, in which the organisers were left to sort through many fake submissions, including fabricated pictures of people impersonating DARPA officials.

Research leader and University of Southampton computer scientist Victor Naroditskiy said companies should consider malicious behaviour a cost of crowdsourcing, given the research showed it was the norm.

“Our work enhances the understanding of the strategic forces coming into play in crowdsourcing contests,” Naroditskiy said.

For example, the researchers found making attacks on crowdsourcing competitions more costly did not deter malicious behaviour, and simply led to more attacks by the weaker player in a contest.
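To make that intuition concrete, here is a minimal sketch in Python. It is not the model from the paper: the prize value, the two players' skill levels, the halving effect of an attack and the Tullock-style win probability are all illustrative assumptions. It enumerates the pure-strategy Nash equilibria of a toy two-player contest in which each entrant can either work on their own solution or attack the other's, across a range of attack costs.

```python
import itertools

# All numbers below are illustrative assumptions, not values from the study.
V = 10.0                               # winner-takes-all prize
SKILL = {"strong": 3.0, "weak": 1.0}   # baseline solution quality of each entrant
DAMAGE = 0.5                           # an attack halves the victim's effective quality

def win_prob(q_self, q_other):
    # Tullock-style contest: win probability proportional to effective quality.
    return q_self / (q_self + q_other)

def payoffs(a1, a2, cost):
    # Expected payoffs when the strong player picks a1 and the weak player a2,
    # each action being "work" or "attack".
    q1, q2 = SKILL["strong"], SKILL["weak"]
    if a2 == "attack":
        q1 *= DAMAGE
    if a1 == "attack":
        q2 *= DAMAGE
    u1 = win_prob(q1, q2) * V - (cost if a1 == "attack" else 0.0)
    u2 = win_prob(q2, q1) * V - (cost if a2 == "attack" else 0.0)
    return u1, u2

def pure_nash(cost):
    # Enumerate pure-strategy Nash equilibria: profiles where neither player
    # can gain by unilaterally switching action.
    actions = ("work", "attack")
    equilibria = []
    for a1, a2 in itertools.product(actions, actions):
        u1, u2 = payoffs(a1, a2, cost)
        if (all(payoffs(alt, a2, cost)[0] <= u1 for alt in actions)
                and all(payoffs(a1, alt, cost)[1] <= u2 for alt in actions)):
            equilibria.append((a1, a2))
    return equilibria

for cost in (0.5, 1.25, 2.0):
    eqs = pure_nash(cost) or "none (mixed strategies only)"
    print(f"attack cost {cost}: {eqs}")
```

With these particular numbers, both players attack when the cost is low; at a moderate cost the stronger player is deterred but the weaker one is not, so no pure equilibrium exists and the weaker player still attacks with positive probability in the mixed equilibrium. That pattern echoes the finding above: raising the cost of attacks shifts them toward the weaker contestant rather than eliminating them.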

However, Dr Lubna Alam, assistant professor in information systems at the University of Canberra, said in her experience the benefits of crowdsourcing outweighed the cost of managing attacks.

“There is scepticism about crowdsourcing – people think when you involve crowds you run the risk of attacks and vandalism…but there are many instances, particularly in non-profit contexts; for example the National Library’s newspaper digitisation project, which has been running for more than five years, and there haven’t been any cases of vandalism.”

“If you trust the crowd, the crowd will trust you back and will give you in return much more than what you expected.”

Dr Alam said the contest side of crowdsourcing had gained momentum in Australia.

“It’s still early days in terms of costing it, but people mostly use it when they don’t have the expertise within the team or organisation, as a way of tapping into expertise from outside.”

Joseph Davis, professor of Information Systems and Services at the University of Sydney, said the results were generally useful at an abstract level.

“However, it is important to remember that the research is applicable to only a small subset of crowdsourcing, that is adversarial crowdsourcing contests,” Prof Davis said.

“The findings are interesting, but the relevance to the broader context of crowdsourcing is somewhat limited.”

Professor Davis said crowdsourcing contest sites like InnoCentive, where there was no adversarial competition, did not face the same problem of malicious behaviour.

“There are obvious quality control issues in all types of crowdsourcing, but effective mechanisms can be put in place to deal with them.”

This article was originally published on The Conversation.
