Posted on March 6, 2015
Crowdsourcing is a relatively new trend that, among many other applications, uses communication technology for the benefit of science. It is a genuinely useful aid in these days of enormous quantities of incoming data and associated journal publications. The demands of scientific research are growing rapidly, so we need to recognize when an investigation is being carried out sensibly and when it can be improved.
The need to evaluate scads of published research literature, or otherwise unmanageable quantities of raw data, ultimately justifies recruiting human help. In crowdsourcing, this has led to the development of bite-size Human Intelligence Tasks (HITs; think of Amazon’s Mechanical Turk). Before crowdsourcing, producing systematic reviews of research data required teams of authors and information specialists to screen citations personally, which demanded great effort to meet deadlines within the available budget. Those screeners should be considered heroes of a sort: lots of work, little time and, on occasion, poor remuneration and recognition.
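The bite-size HIT idea can be sketched in a few lines. This is a minimal illustration, not any platform’s actual API: the function name and the toy citation list are invented, and real platforms add metadata such as rewards and time limits to each task.

```python
# Hypothetical sketch: split a long citation-screening job into small
# batches, one batch per Human Intelligence Task (HIT).

def make_hits(citations, batch_size=5):
    """Group citations into small batches, one batch per HIT."""
    return [citations[i:i + batch_size]
            for i in range(0, len(citations), batch_size)]

citations = [f"Citation {n}" for n in range(1, 13)]  # 12 toy records
hits = make_hits(citations, batch_size=5)
print(len(hits))      # 3 HITs: two batches of 5 citations, one of 2
print(len(hits[-1]))  # 2
```

Each resulting batch is small enough for an anonymous worker to complete in minutes, which is what makes the overall screening effort tractable for a crowd.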
Through crowdsourcing, scientific projects can count on genuinely interested members of the public: anonymous heroes around the world helping with other people’s research. Furthermore, these volunteers have been found to perform work of the same quality as a topic expert, depending on the motivation, rewards, and nature of the task involved.
Jeff Howe and Mark Robinson coined the term “crowdsourcing” in 2005 to describe the outsourcing of work to individuals.
The people who help are also known as microworkers. They are part of a complex structure: they are, in effect, the bricks of a monumental collective effort.
Crowdsourcing has been used in science to:
Crowdsourcing, or ‘citizen science’ as it is often called, has also been shown to improve quality, reduce cost, increase speed, and bring a wider range of opinions into the research process.
Easy access to the internet is one of the factors that makes this process feasible. Because there is often no need to restrict which population gets involved, opportunities are open to people anywhere in the world. Open recruitment lets people feel attached without pressure and, surprisingly, their anonymous, detached status makes them even more committed to the process.
Advantages of crowdsourcing platforms:
Crowdsourcing platforms can be found through www.crowdsourcing.org, a neutral organization dedicated to crowdsourcing and crowdfunding. It is recognized worldwide, and deservedly so, because it hosts a vast collection of providers.
Crowdsourcing also has its own dilemmas. To reduce the number of mistakes, projects that use crowdsourcing as a tool require careful design and quality control. Health research in particular demands high standards, privacy safeguards, and leaders with a very strong grounding in the study area. Other blind spots that still need to be resolved include rewards, author recognition, reliability of checkpoints, and systematization of the process.
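One common quality-control technique is redundancy: the same task is given to several microworkers and the majority answer is kept. A minimal sketch, with invented worker labels for illustration:

```python
# Sketch of redundancy-based quality control: several workers judge the
# same item, and the most common answer wins.
from collections import Counter

def majority_vote(labels):
    """Return the most frequent label among redundant worker judgments."""
    return Counter(labels).most_common(1)[0][0]

# Three workers screen the same citation; two say "include", one "exclude".
judgments = ["include", "exclude", "include"]
print(majority_vote(judgments))  # include
```

Redundancy raises cost, since each item is paid for several times, which is exactly the kind of design trade-off crowdsourced projects must weigh.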