I would like to consider the system of peer review, networking, conferencing, publishing and acclaim and see how closely it approximates an ideal system for uncovering truth, and further, imagine how we could make a better system. But I will not do that right now; for now I am just collecting some provocative links on the theme, in the hope of finding time for more thought later.
PubPeer (who are behind Peeriodicals) produces a peer-review overlay for web browsers to spread their commentary and peer critique more widely. The site itself is brusquely confusing, but it is well blogged; you’ll get the idea. They are not afraid of invective, and at first I thought they looked more amateurish than effective. But I was wrong: they are quite selective, and they seem to be among the best elective peer review available today. The system has been implicated in topical high-profile retractions (e.g. 1, 2).
Related question: How do we discover research to peer review?
Robin Hanson, of course, has much to say on potentially better mechanism design for scientific discovery. I have qualms that his implied cash-rewards system might crowd out reputational rewards; I think there is something to be said for that particular economy. But yes, why not try it out?
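Hanson’s best-known concrete mechanism in this space is the logarithmic market scoring rule (LMSR), which underpins his “idea futures” proposal for betting on scientific claims. A minimal sketch follows; the liquidity parameter `b`, the two-outcome replication claim, and the example trade sizes are illustrative choices of mine, not from any particular implementation:

```python
import math

def lmsr_cost(quantities, b=100.0):
    """Market maker's cost function: C(q) = b * log(sum_i exp(q_i / b))."""
    return b * math.log(sum(math.exp(q / b) for q in quantities))

def trade_cost(q_before, q_after, b=100.0):
    """What a trader pays to move outstanding shares from q_before to q_after."""
    return lmsr_cost(q_after, b) - lmsr_cost(q_before, b)

def prices(quantities, b=100.0):
    """Instantaneous probabilities implied by the current market state."""
    exps = [math.exp(q / b) for q in quantities]
    total = sum(exps)
    return [e / total for e in exps]

# A two-outcome claim ("replicates" / "fails to replicate"), symmetric start:
state = [0.0, 0.0]
print(prices(state))            # [0.5, 0.5]
# Buying 50 shares of "replicates" pushes its implied probability up,
# and the trader's payment comes straight from the cost function:
print(prices([50.0, 0.0])[0])   # ~0.62
print(trade_cost(state, [50.0, 0.0]))
```

The appeal for science funding is that the market price is a continuously updated consensus probability, and payouts reward being right rather than being eminent; the crowding-out worry above is precisely about what such cash incentives do to the reputational economy.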
Here’s a thing I wish were said a little better, but which I think is important: An Adversarial Review of “Adversarial Generation of Natural Language”. The argument is that even though it’s nice that arXiv avoids some of the problems of traditional publishing, it reintroduces some of the problems that traditional publishing tries to avoid. This was foreseeable.
Cameron Neylon runs a cottage industry producing pragmatic publishing critique from an institutional economics perspective:
we’d been talking about communities, cultures, economics, “public-making” but it was the word ‘club’ and its associated concepts, both pejorative and positive that crystalised everything. We were talking about the clubbishness of making knowledge — the term “Knowledge Clubs” emerged quickly — but also the benefits that such a club might gain in choosing to invest in wider sharing.
In the business setting this often leads incumbent publishers to a kind of spluttering defense of the value they create, while simultaneously complaining that the customer doesn’t appreciate their work. Flip the target slightly and we’d call this “missing the new market opportunity” or “failing to express the value offering clearly”. […]
Lingua, […] has gone from one of the most important journals in analytical linguistics to no longer being in the field, and seems well on its way to becoming irrelevant. How does a company as competent in its business strategy as Elsevier let this happen? I would argue, as I did at the time that the former editorial board of Lingua resigned to form Glossa that it was a failure to understand the assets.
The neoliberal analysis of Lingua showed an asset generating good revenues, with good analytics and a positive ROI. The capitalist analysis focussed on the fixed assets and trademarks. But it turns out these weren’t what was creating value. What was creating value was the community, built around an editorial board and the good will associated with that.
Also see Pushing costs downstream.
Here is an interesting edge case in peer review and scientific reputation: Adam Becker, Junk Science or the Real Thing? ‘Inference’ Publishes Both. As far as I’m concerned, publishing crap is not in itself a catastrophe. A process that fails to discourage crap would be bad.
What is academia for?
This essay traces the history of refereeing at specialist scientific journals and at funding bodies and shows that it was only in the late twentieth century that peer review came to be seen as a process central to scientific practice. Throughout the nineteenth century and into much of the twentieth, external referee reports were considered an optional part of journal editing or grant making. The idea that refereeing is a requirement for scientific legitimacy seems to have arisen first in the Cold War United States. In the 1970s, in the wake of a series of attacks on scientific funding, American scientists faced a dilemma: there was increasing pressure for science to be accountable to those who funded it, but scientists wanted to ensure their continuing influence over funding decisions. Scientists and their supporters cast expert refereeing—or “peer review,” as it was increasingly called—as the crucial process that ensured the credibility of science as a whole. Taking funding decisions out of expert hands, they argued, would be a corruption of science itself. This public elevation of peer review both reinforced and spread the belief that only peer-reviewed science was scientifically legitimate.
Thomas Basbøll says:
It is commonplace today to talk about “knowledge production” and the university as a site of innovation. But the institution was never designed to “produce” something nor even to be especially innovative. Its function was to conserve what we know. It just happens to be in the nature of knowledge that it cannot be conserved if it does not grow.
So why did Wolfram announce his ideas this way? Why not go the traditional route? “I don’t really believe in anonymous peer review,” he says. “I think it’s corrupt. It’s all a giant story of somewhat corrupt gaming, I would say. I think it’s sort of inevitable that happens with these very large systems. It’s a pity.”
So what are Wolfram’s goals? He says he wants the attention and feedback of the physics community. But his unconventional approach—soliciting public comments on an exceedingly long paper—almost ensures it shall remain obscure. Wolfram says he wants physicists’ respect. The ones consulted for this story said gaining it would require him to recognize and engage with the prior work of others in the scientific community.
And when provided with some of the responses from other physicists regarding his work, Wolfram is singularly unenthused. “I’m disappointed by the naivete of the questions that you’re communicating,” he grumbles. “I deserve better.”
Afonso, Alexandre. 2013. “How Academia Resembles a Drug Gang.” Impact of Social Sciences. December 11, 2013. http://blogs.lse.ac.uk/impactofsocialsciences/2013/12/11/how-academia-resembles-a-drug-gang/.
Agassi, Joseph. 1974. “The Logic of Scientific Inquiry.” Synthese 26: 498–514. https://doi.org/10.1007/BF00883107.
a Literal Banana. 2020. “Extended Sniff Test,” 7.
Alon, Uri. 2009. “How to Choose a Good Scientific Problem.” Molecular Cell 35 (6): 726–28. https://doi.org/10.1016/j.molcel.2009.09.013.
Arbesman, Samuel, and Nicholas A Christakis. 2011. “Eurekometrics: Analyzing the Nature of Discovery.” PLoS Computational Biology 7 (6): e1002072. https://doi.org/10.1371/journal.pcbi.1002072.
Arbilly, Michal, and Kevin N. Laland. 2017. “The Magnitude of Innovation and Its Evolution in Social Animals.” Proceedings of the Royal Society B: Biological Sciences 284 (1848). https://doi.org/10.1098/rspb.2016.2385.
Azoulay, Pierre, Christian Fons-Rosen, and Joshua S. Graff Zivin. 2015. “Does Science Advance One Funeral at a Time?” Working Paper 21788. National Bureau of Economic Research. https://doi.org/10.3386/w21788.
Baldwin, Melinda. 2018. “Scientific Autonomy, Public Accountability, and the Rise of ‘Peer Review’ in the Cold War United States.” Isis 109 (3): 538–58. https://doi.org/10.1086/700070.
Björk, Bo-Christer, and David Solomon. 2013. “The Publishing Delay in Scholarly Peer-Reviewed Journals.” Journal of Informetrics 7 (4): 914–23. https://doi.org/10.1016/j.joi.2013.09.001.
Bogich, Tiffany L, Sebastien Ballesteros, Robin Berjon, Chris Callahan, and Leon Chen. n.d. “On the Marginal Cost of Scholarly Communication.” https://research.science.ai/article/on-the-marginal-cost-of-scholarly-communication.
Coscia, Michele, and Luca Rossi. 2020. “Distortions of Political Bias in Crowdsourced Misinformation Flagging.” Journal of the Royal Society Interface 17 (167): 20200020. https://doi.org/10.1098/rsif.2020.0020.
Couzin-Frankel, Jennifer. 2015. “PubPeer Co-Founder Reveals Identity—and New Plans.” Science 349 (6252): 1036. https://doi.org/10.1126/science.349.6252.1036.
Gelman, Andrew. 2011. “Experimental Reasoning in Social Science.” In Field Experiments and Their Critics.
Gharbi, Musa al-. 2020. “Race and the Race for the White House: On Social Research in the Age of Trump.” Preprint. SocArXiv. https://doi.org/10.31235/osf.io/n8bkh.
“Go Forth and Replicate!” 2016. Nature News 536 (7617): 373. https://doi.org/10.1038/536373a.
Greenberg, Steven A. 2009. “How Citation Distortions Create Unfounded Authority: Analysis of a Citation Network.” BMJ 339 (July): b2680. https://doi.org/10.1136/bmj.b2680.
Hallsson, Bjørn G., and Klemens Kappel. 2020. “Disagreement and the Division of Epistemic Labor.” Synthese 197 (7): 2823–47. https://doi.org/10.1007/s11229-018-1788-6.
Heesen, Remco, and Liam Kofi Bright. n.d. “Is Peer Review a Good Idea?” The British Journal for the Philosophy of Science. Accessed July 26, 2020. https://doi.org/10.1093/bjps/axz029.
Hodges, James S. 2019. “Statistical Methods Research Done as Science Rather Than Mathematics.” May 20, 2019. http://arxiv.org/abs/1905.08381.
Ioannidis, John P. 2005. “Why Most Published Research Findings Are False.” PLoS Medicine 2 (8): e124. https://doi.org/10.1371/journal.pmed.0020124.
Jiménez, Ángel V., and Alex Mesoudi. 2019. “Prestige-Biased Social Learning: Current Evidence and Outstanding Questions.” Palgrave Communications 5 (1): 1–12. https://doi.org/10.1057/s41599-019-0228-7.
Kirman, Alan. 1992. “Whom or What Does the Representative Individual Represent?” The Journal of Economic Perspectives 6 (2): 117–36. http://www.jstor.org/stable/2138411.
———. 2010. “Learning in Agent Based Models.”
Krikorian, Gaëlle, and Amy Kapczynski. 2010. Access to Knowledge in the Age of Intellectual Property. New York; Cambridge, Mass.: Zone Books ; Distributed by the MIT Press. https://monoskop.org/images/e/e7/Krikorian_Kapczynski_eds_Access_to_Knowledge_in_the_Age_of_Intellectual_Property_2010.pdf.
Lakatos, Imre. 1980. The Methodology of Scientific Research Programmes: Volume 1 : Philosophical Papers. Cambridge University Press.
Laland, Kevin N. 2004. “Social Learning Strategies.” Animal Learning & Behavior 32 (1): 4–14. https://doi.org/10.3758/BF03196002.
McCook, Alison. 2017. “Meet PubPeer 2.0: New Version of Post-Publication Peer Review Site Launches Today.” Retraction Watch. June 15, 2017. https://retractionwatch.com/2017/06/15/meet-pubpeer-2-0-new-version-post-publication-peer-review-site-launches-today/.
Medawar, Peter B. 1969. Induction and Intuition in Scientific Thought. American Philosophical Society Philadelphia.
———. 1982. Pluto’s Republic. Oxford University Press.
———. 1984. The Limits of Science. Harper & Row.
Merali, Zeeya. 2010. “Computational Science: Error.” Nature 467: 775–77. https://doi.org/10.1038/467775a.
“Nature Editors: All Hat and No Cattle.” 2016, December. https://blog.pubpeer.com/publications/AE11BE44CF3C40A558F3B453BF53C7#14.
Nguyen, C. Thi. 2020. “Cognitive Islands and Runaway Echo Chambers: Problems for Epistemic Dependence on Experts.” Synthese 197 (7): 2803–21. https://doi.org/10.1007/s11229-018-1692-0.
“Post-Publication Criticism Is Crucial, but Should Be Constructive.” 2016. Nature News 540 (7631): 7. https://doi.org/10.1038/540007b.
Post, Daniel J. van der, Mathias Franz, and Kevin N. Laland. 2016. “Skill Learning and the Evolution of Social Learning Mechanisms.” BMC Evolutionary Biology 16 (1): 166. https://doi.org/10.1186/s12862-016-0742-9.
Potts, Jason, John Hartley, Lucy Montgomery, Cameron Neylon, and Ellie Rennie. 2016. “A Journal Is a Club: A New Economic Model for Scholarly Publishing.” SSRN Scholarly Paper ID 2763975. Rochester, NY: Social Science Research Network. http://papers.ssrn.com/abstract=2763975.
Rekdal, Ole Bjørn. 2014. “Academic Urban Legends.” Social Studies of Science 44 (4): 638–54. https://doi.org/10.1177/0306312714535679.
Ridley, J, N Kolm, R P Freckelton, and M J G Gage. 2007. “An Unexpected Influence of Widely Used Significance Thresholds on the Distribution of Reported P-Values.” Journal of Evolutionary Biology 20: 1082–9. https://doi.org/10.1111/j.1420-9101.2006.01291.x.
Ritchie, Stuart. 2020. Science Fictions: How Fraud, Bias, Negligence, and Hype Undermine the Search for Truth. First edition. New York: Metropolitan Books ; Henry Holt and Company.
Rzhetsky, Andrey, Jacob G. Foster, Ian T. Foster, and James A. Evans. 2015. “Choosing Experiments to Accelerate Collective Discovery.” Proceedings of the National Academy of Sciences 112 (47): 14569–74. https://doi.org/10.1073/pnas.1509757112.
Schimmer, Ralf, Kai Karin Geschuhn, and Andreas Vogler. 2015. “Disrupting the Subscription Journals’ Business Model for the Necessary Large-Scale Transformation to Open Access.” https://doi.org/10.17617/1.3.
Sekara, Vedran, Pierre Deville, Sebastian E. Ahnert, Albert-László Barabási, Roberta Sinatra, and Sune Lehmann. 2018. “The Chaperone Effect in Scientific Publishing.” Proceedings of the National Academy of Sciences 115 (50): 12603–7. https://doi.org/10.1073/pnas.1800471115.
Sen, Amartya K. 1977. “Rational Fools: A Critique of the Behavioral Foundations of Economic Theory.” Philosophy and Public Affairs 6: 317–44. https://doi.org/10.2307/2264946.
Spranzi, Marta. 2004. “Galileo and the Mountains of the Moon: Analogical Reasoning, Models and Metaphors in Scientific Discovery.” Journal of Cognition and Culture 4 (3): 451–83. https://doi.org/10.1163/1568537042484904.
Stove, David Charles. 1982. Popper and After: Four Modern Irrationalists. Pergamon.
Suppes, Patrick. 2002. Representation and Invariance of Scientific Structures. CSLI Publications.
Thagard, Paul. 1993. “Societies of Minds: Science as Distributed Computing.” Studies in History and Philosophy of Modern Physics 24: 49.
———. 1994. “Mind, Society, and the Growth of Knowledge.” Philosophy of Science 61.
———. 1997. “Collaborative Knowledge.” Noûs 31 (2): 242–61.
———. 2005. “How to Be a Successful Scientist.” Scientific and Technological Thinking, 159–71.
———. 2007. “Coherence, Truth, and the Development of Scientific Knowledge.” Philosophy of Science 74: 28–47. https://doi.org/10.1086/520941.
Thagard, Paul, and Abninder Litt. 2008. “Models of Scientific Explanation.” In The Cambridge Handbook of Computational Psychology. Cambridge: Cambridge University Press.
Thagard, Paul, and Jing Zhu. 2003. “Acupuncture, Incommensurability, and Conceptual Change.” Intentional Conceptual Change, 79–102.
Thurner, Stefan, and Rudolf Hanel. 2010. “Peer-Review in a World with Rational Scientists: Toward Selection of the Average.”
Van Noorden, Richard. 2013. “Open Access: The True Cost of Science Publishing.” Nature 495 (7442): 426–29. https://doi.org/10.1038/495426a.
Vazire, Simine. 2017. “Our Obsession with Eminence Warps Research.” Nature News 547 (7661): 7. https://doi.org/10.1038/547007a.
Wible, James R. 1998. Economics of Science. Routledge.
Yarkoni, Tal. 2019. “The Generalizability Crisis.” Preprint. PsyArXiv. https://doi.org/10.31234/osf.io/jqw35.