Crowd-sourced science

August 11, 2015 — May 17, 2022

collective knowledge
game theory
how do science
information provenance
the rather superior sort of city

Mapping the world from smartphones and chat forums and reddit threads.

1 Incoming

2 Open source science

Every Bug is Shallow if One of Your Readers is an Entomologist:

Research is difficult because reality is complex and many things are confusing or mysterious. But with enough eyeballs, all research bugs are shallow too.

Without a huge research budget and dozens of managers, you won’t be able to coordinate a ton of researchers. But the good news is, you didn’t really want to coordinate everyone anyway. You can just open the gates and let people get to work. It works fine for software!

3 Crowdsourced competitions

Another paradigm for crowdsourcing is to gamify science. That is not my jam; I feel that participants deserve a better stake than having their dopamine systems twiddled, and the incentives of recruiting that way are not ideal: scientists’ games will never be as fun as the highly evolved products of real game companies, and the participants we can recruit via addictive behaviour are probably invested in a suboptimal manner for our goals.

4 Tools

Software frameworks for data collection (thanks to Dan Pagendam for the tip):
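Whatever framework you pick, the core job is the same: pooling noisy reports from many contributors into one estimate per place or question. A minimal sketch of that aggregation step (all names here are hypothetical, not from any particular framework) might look like:

```python
from collections import defaultdict
from statistics import median


def aggregate(reports):
    """Combine crowd-sourced (location, value) reports into one estimate
    per location, using the per-location median, which is robust to a
    minority of careless or wildly wrong contributors."""
    by_location = defaultdict(list)
    for location, value in reports:
        by_location[location].append(value)
    return {loc: median(values) for loc, values in by_location.items()}


# Three phones report a temperature at the same spot; one is badly off.
reports = [
    ("corner_of_george_st", 21.5),
    ("corner_of_george_st", 22.0),
    ("corner_of_george_st", 99.0),  # outlier from a faulty sensor
    ("town_hall", 5.0),
    ("town_hall", 5.5),
]
print(aggregate(reports))
# → {'corner_of_george_st': 22.0, 'town_hall': 5.25}
```

The median here stands in for the fancier reputation-weighted or model-based estimators that real participatory-sensing systems use; the point is that aggregation, not collection, is where most of the design effort goes.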
