Earthquake researchers have a problem. So do scientists trying to investigate the spread of deadly malaria. And conservationists trying to get a handle on illegal logging may have it worst of all.
What connects all of these is that, when it comes to cracking some of their fields’ biggest problems, traditional scientific methods are not fully up to the task.
The earthquake researchers would like monitoring systems that fully span high-risk areas, but deploying enough research-grade sensors to cover hundreds of miles of fault lines would cost millions of dollars. Swiss malaria researchers need to run enormous numbers of calculations to simulate the spread of malaria worldwide: valuable information for governments deciding where best to spend limited resources on life-saving interventions. But the computing power required to run them is too expensive. And in the deep, lush forests of the Congo Basin, finding enough well-trained people to monitor illegal tree felling and poaching is almost impossible.
But science is changing. In the age of the internet, it is waking up to the idea of people power: the combined efforts of thousands of ordinary connected volunteers can help collect or crunch overwhelming masses of data.
“There’s so many people that can take part in this way, I think crowd-sourcing could almost be more important than the development of the Web,” says Ben Segal, who has worked on many volunteer computing projects at the European Particle Physics Laboratory, Cern – and who mentored, among others, a young Tim Berners-Lee, who went on to invent the World Wide Web.
“Citizen science” is not a new concept. So-called “volunteer computing” projects have expanded rapidly since the launch of Seti@Home in 1999, a program that still uses the power of millions of ordinary computers in screen-saver mode to help search for signs of intelligent life in the universe. By tapping into spare processing power and making use of previously wasted “cycles” while a machine sits idle, desktops and laptops scattered worldwide can band together to mimic the number-crunching power of a supercomputer.
By the end of the last decade, several projects were harnessing volunteer power to solve complex problems, ranging from cataloguing stars in distant corners of the universe with Galaxy Zoo to predicting the complex three-dimensional structures of proteins with FoldIt. There is even a site (scistarter.com) devoted to the growing popularity of citizen science, where people can discover, take part in and fund research projects.
But those at the forefront of the field say that citizen science is now beginning to enter a new era. What has changed is a growing sense that participants can actively take part in projects, rather than passively allowing their idle computer to do the grunt work. “Their feeling is that science is too important to be left to scientists alone,” says Francois Grey from the Citizen Cyberscience Centre, a collaboration set up in 2009 between CERN, the University of Geneva, and the United Nations, with seed money from the Shuttleworth Foundation.
Tools for the trade
Grey’s Citizen Cyberscience Centre is one of the main operations pushing citizen science into unexplored territory. One reason this is becoming increasingly possible, Grey says, is that the technology barrier is dropping, putting ever more sophisticated hardware into citizens’ hands.
One project the CCC supports – the “Quake Catcher Network”, or QCN as it is known – epitomizes the trend towards ever-smaller, more nimble devices based upon the latest chips. A customized external motion-detecting device with a USB plug turns people’s ordinary desktops into automated earthquake detectors. Connect those computers via the internet to a centralized server, and you have a wide-ranging network that can map an earthquake’s aftermath.