
Cochrane and Crowdsourcing: an interview with Anna Noel-Storr

Posted on April 21, 2015

As Larry Bird once said, “leadership is diving for a loose ball, getting the crowd involved, getting other players involved. It’s being able to take it as well as dish it out.” And, I should add, leadership is also about opening new pathways that keep the innovative spirit going.

In this post I aim to give a brief explanation of how Anna Noel-Storr, the Trials Search Co-ordinator at the Cochrane Dementia and Cognitive Improvement Group, has been doing just this: leading the development of an innovative system that involves science lovers around the world in reading and screening dementia evidence, gathering quality scientific evidence in the process.

As life expectancy gradually rises in the developed world, there is growing interest everywhere in improving the normal ageing process. Many people are taking greater care of their health in order to achieve a healthy and independent later life, and chief among these aims is preventing dementia.

The Cochrane Dementia Group’s Modifiable Risk Factors project, which involves a plan to produce several systematic reviews on the subject, has utilized new crowdsourcing techniques to speed up the process, as has been discussed here. The NIHR has funded the programme to produce four “suites” of reviews. As Anna explains, “Each suite looks at the same intervention but across three different populations: middle-age, elderly and those with mild cognitive impairment. The four interventions are: physical activity, cognition-based interventions, vitamins and minerals and dietary modifications.”

“Once approved, the full reviews can get underway, starting with the searches for potential studies.”

The process begins, as all Cochrane reviews do, with a protocol. “Once approved, the full reviews can get underway, starting with the searches for potential studies. Once the results have been downloaded the results need to be gone through.” This is where Anna’s team looks to the ‘crowd’ for help. This innovation provides an alternative to the existing method whereby, for the majority of reviews, “the citations would be screened by members of the author team – usually two people, screening independently of each other with a third person ready to make a final decision on citations where the screeners have not agreed.” That represents a lot of time, especially when screeners with different points of view must reach agreement across thousands of records. Crowdsourcing was introduced in the Cochrane Dementia Group as an alternative way to cope with the extraordinary number of papers that must be screened to produce just a single meta-analysis or systematic review.
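The dual-screening rule Anna describes boils down to something very simple: two independent decisions, with a third person consulted only when they disagree. A minimal sketch of that logic in Python (the function names and labels are illustrative, not taken from any real Cochrane tool):

```python
def final_decision(screener_a, screener_b, adjudicator):
    """Resolve one citation under dual independent screening.

    screener_a, screener_b: 'include' or 'exclude' decisions, each
    made without seeing the other screener's choice.
    adjudicator: a callable consulted only when the two disagree.
    """
    if screener_a == screener_b:
        return screener_a   # agreement: the shared decision stands
    return adjudicator()    # disagreement: the third person decides

# Example: the screeners disagree, so the third reviewer settles it.
decision = final_decision("include", "exclude", lambda: "include")
print(decision)  # -> include
```

The point of the sketch is why this costs so much author time: every citation needs at least two human decisions, and every disagreement needs a third, which is exactly the workload the crowd model redistributes.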

The development of a slick tool was not immediate. Building on the earlier Trial Blazers study, the team was able to improve this alternative to in-house citation screening. The most important part of the process is finding committed people to do the screening. “We recruit through various routes,” Anna explains, “mainly through Cochrane newsletters but also through direct contact with universities, and of course networks like S4BE are fantastic.” Through these routes, enthusiastic people around the world who are interested in taking part in dementia-related research have helped to advance scientific knowledge in this area.

“This is about trying to achieve that perfect alignment between people, process and technology.”

The development of the web application was central to the project. According to Anna Noel-Storr, the endeavour would not work without technology to support the task. “This is about trying to achieve that perfect alignment between people, process and technology, and if one of those elements is not right, the whole endeavour is jeopardised.” In other words, the user experience has to be good. Anna doesn’t think it’s perfect, but she believes the main things are right: sign-up, navigation, use of highlighted words and phrases, compatibility on a tablet, and various in-built feedback mechanisms. Plus, of course, behind the scenes, the algorithm ensures that records are classified the right number of times and end up in the right place.
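One way such an algorithm could work is to hold each record open until it has collected a required number of crowd classifications, then route it by majority vote, escalating ties. This is purely a sketch under assumed rules; the three-vote threshold, labels, and escalation route are my illustration, not the actual system’s logic:

```python
from collections import Counter

REQUIRED_VOTES = 3  # assumed threshold, chosen only for illustration

def route_record(votes):
    """Decide where a record goes given its crowd classifications so far.

    votes: list of labels such as 'RCT' or 'not RCT' from different
    screeners. Returns 'needs more votes' until REQUIRED_VOTES is
    reached, then the majority label, or 'expert review' on a tie.
    """
    if len(votes) < REQUIRED_VOTES:
        return "needs more votes"
    (top, top_n), *rest = Counter(votes).most_common()
    if rest and rest[0][1] == top_n:
        return "expert review"   # no clear majority: escalate
    return top

print(route_record(["RCT", "RCT"]))             # -> needs more votes
print(route_record(["RCT", "RCT", "not RCT"]))  # -> RCT
```

Whatever the real rules are, the design goal Anna describes is the same: no record leaves the system with too few eyes on it, and every record ends up in a defined place.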

The crowdsourcing process obviously solves a lot of problems, but causes some new and unexpected ones as well. As Anna points out, “Though it offers huge potential in terms of sustainability and in increasing our capacity to deal with ever increasing amounts of data, it takes a reasonable amount of coordination to sustain and nurture an online community. I think I probably underestimated the work involved in coordination. A crowd is a new kind of team for me.”

Establishing the best mechanisms for clear and open communication, finding ways to provide feedback and support, and working out the right sort of rewards and incentives have all taken some time. “We are really aware that in all these areas we could do better, and we’ve had fantastic feedback from those who have taken part in this project, but also in other projects where we have used a crowd model.”

“A crowd is a new kind of team for me.”

When the full texts of the RCTs found by the crowd have been obtained, the focus will turn to data extraction and quality assessment, “and hopefully, but not always, meta-analysis”. “It’s a long process and hard work but it’s important to do it, and to do it systematically.” Not least because proper systematization helps guarantee that the results are accurate and reliable.

Dementia is one of the most disabling conditions of our time, and we know that its prevalence is affected by lifestyle and behaviour. As a final thought, I wonder whether there could be any benefit for screeners in terms of their exposure to current research trends through reading citations. “We don’t have any evidence on this at the moment. It would be an interesting area to consider…I hope that one of the benefits of undertaking what could be seen as quite a repetitive task is that there is an opportunity to become more aware of the research that has been done in this area, not to mention witness the variation in the reporting of research!” And how has she benefited from the project herself? “I feel I have learnt quite a lot about trial design and dementia research through having screened thousands of citations in this area.”

For further reading on this topic here on S4BE, see our other posts in the crowdsourcing series. To keep track of the future of crowdsourcing in Cochrane, follow Anna on Twitter @AnnaNoelStorr.


Many thanks to Anna Noel-Storr for her help and co-operation.

Sofía Jaramillo

Medical student. Volunteer in social activities. SEXSAR Club. Teaching Assistant in Immunology at the University of Cuenca.



creative commons license
Cochrane and Crowdsourcing: an interview with Anna Noel-Storr by Sofía Jaramillo is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. Unless otherwise stated, all images used within the blog are not available for reuse or republication as they are purchased for Students 4 Best Evidence from shutterstock.com.
