Just a Face in the Crowd

Crowdsourcing to Tame the Medical Literature?

[Figure: Medline Articles per Year]

Medical literature is growing beyond the grasp of any individual trying to comprehend its breadth. PubMed now holds more than 23 million records. Crowdsourcing may be a necessary way of coping, and in a new PLOS ONE publication, Andrew Brown and David Allison offer evidence that it can work.

Using Amazon’s MTurk service, they doled out four groups of tasks to “microworkers” for the purpose of evaluating the literature on nutrition and obesity. The tasks all related to a central question: Do nutrition and obesity studies with conclusions that agree with popular opinion get more attention in the scientific community than do those that don’t?

The individual tasks each microworker completed were small, and the total cost was trivial: $312.61, even after adjusting pay rates to reflect minimum wage. But by dividing the larger job into pieces and completing it through crowdsourcing, the researchers saved considerable calendar time. Brown and Allison conclude:

Manually and continuously evaluating literature may be manageable for specific subtopics, but may be unwieldy even for a topic as specific as obesity (>18,000 papers in PubMed indexed in 2012 alone). Crowdsourcing can save calendar time by increasing the number of individuals working on a given task at one time.

Appropriately constructed and controlled crowdsourcing has potential to help improve the timeliness of research evaluation and synthesis.
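The mechanism behind that calendar-time saving is simple parallelism: a big screening job gets split into many small batches, and each batch becomes a task that one worker can complete independently. A minimal sketch of the idea, in Python, is below. This is a hypothetical illustration, not the authors' actual pipeline; the batch size, per-task time, and worker counts are made-up assumptions for the sake of the arithmetic.

```python
import math

# Hypothetical sketch (not the study's actual code): split a large
# literature-screening job into small batches ("microtasks") so that
# many crowd workers can process them in parallel, MTurk-style.

def make_microtasks(record_ids, batch_size):
    """Split a list of article IDs into fixed-size batches, one per task."""
    return [record_ids[i:i + batch_size]
            for i in range(0, len(record_ids), batch_size)]

def calendar_time(n_tasks, minutes_per_task, n_workers):
    """Rough wall-clock estimate when tasks run in parallel across workers."""
    rounds = math.ceil(n_tasks / n_workers)
    return rounds * minutes_per_task

# Using the post's figure of >18,000 obesity papers indexed in 2012,
# with assumed values: 10 papers per microtask, 5 minutes per microtask.
tasks = make_microtasks(list(range(18000)), batch_size=10)   # 1,800 microtasks
solo = calendar_time(len(tasks), minutes_per_task=5, n_workers=1)    # one reviewer
crowd = calendar_time(len(tasks), minutes_per_task=5, n_workers=100) # 100 workers
print(len(tasks), solo, crowd)  # 1800 tasks; 9000 vs. 90 minutes of calendar time
```

The per-worker effort is unchanged; only the elapsed calendar time shrinks, which is exactly the benefit the authors highlight.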

“Alone we can do so little; together we can do so much.” — Helen Keller

Click here to read the study in PLOS ONE, click here to read musings on the growth of medical literature in 1913, click here to read more about MTurk, and click here to read more about crowdsourcing in the scientific method.

Just a Face in the Crowd, photograph © Scott Cresswell / flickr
