Realpolitik of journals
Journal rank, journal impact factor and so on: who cares? Your funders care, perhaps against your advice, but they have the money, so you need to care too if you want them to keep funding you.
Latrobe explains it. The SCImago Journal Rank is the Google PageRank-inspired, slightly hipper journal ranking; their search tool is probably what you want. Impact factors date from the 1960s and are still around; the h-index is also a thing.
According to Latrobe, we have the following indices and a (partial) list of their weaknesses.
Hirsch index: The number of articles in a journal [h] that have received at least [h] citations over a citation period.

- Editors can manipulate it by requiring contributors to add citations to articles from their journal.
- It increases with age, so it is biased towards journals (and researchers) with long publication records.
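The h-index definition above is easy to compute. A minimal sketch, using made-up citation counts rather than real data:

```python
def h_index(citations):
    """h-index: the largest h such that at least h items
    have received at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank  # still at least `rank` items with >= `rank` citations
        else:
            break
    return h

# Toy example: five articles with these citation counts.
print(h_index([10, 8, 5, 4, 3]))  # 4: four articles have >= 4 citations each
```

The same function works whether the "items" are one author's papers or one journal's articles; only the input list changes.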
Journal Impact Factor: Citations to a journal in the JCR year to items published in the previous two years, divided by the total number of citable items (articles and reviews) published in the journal in the previous two years.

- Limited to journals indexed in Web of Science.
- Cannot be used to compare journals across different subject categories.
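The impact factor definition reduces to a single ratio. A toy calculation, with invented numbers:

```python
def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Journal Impact Factor for a JCR year: citations in that year
    to items from the previous two years, divided by the citable
    items (articles and reviews) published in those two years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# e.g. 200 citations in the JCR year to the journal's 2021-2022 items,
# with 80 citable items published in 2021-2022:
print(impact_factor(200, 80))  # 2.5
```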
SCImago Journal Rank: Average number of weighted citations received in a year by articles published in a journal in the previous 3 years.

- Alleged weaknesses are that it is “complicated” and that the resulting numbers are small.

So I guess if you must rank journals, this is the least bad method?
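To make the PageRank lineage concrete, here is a toy prestige ranking over a journal-level citation matrix. This is plain PageRank power iteration, not the real SJR algorithm (which also normalises by article counts and restricts citations to a 3-year window); the citation matrix is made up:

```python
# C[i][j] = citations from journal i to journal j (toy numbers).
C = [
    [0, 3, 1],
    [2, 0, 4],
    [5, 1, 0],
]
n = len(C)
d = 0.85  # damping factor, as in PageRank

# Row-stochastic transition matrix: each journal's outgoing
# citations are normalised to sum to 1.
P = [[c / sum(row) for c in row] for row in C]

r = [1 / n] * n  # start with uniform prestige
for _ in range(100):
    # A citation from a high-prestige journal transfers more prestige.
    r = [(1 - d) / n + d * sum(r[i] * P[i][j] for i in range(n))
         for j in range(n)]

print(r)  # prestige scores, summing to 1
```

The key property is that the scores are citation-weighted: being cited by a highly ranked journal raises your score more than being cited by an obscure one.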
See also academic reading workflow for reader-oriented tips.
A platform for scholarly publishing and peer review that empowers researchers with the
- Autonomy to pursue their passions,
- Authority to develop and disseminate their work, and
- Access to engage with the international community of scholars.
Millions of research papers are available for free on government and university web servers, legally uploaded by the authors themselves, with the express permission of publishers. Unpaywall automatically harvests these freely shared papers from thousands of legal institutional repositories, preprint servers, and publishers, making them all available to you as you read.
Zenodo “is an open dependable home for the long-tail of science, enabling researchers to share and preserve any research outputs in any size, any format and from any science.”
- Research. Shared. — all research outputs from across all fields of science are welcome!
- Citeable. Discoverable. — uploads get a Digital Object Identifier (DOI) to make them easily and uniquely citeable…
- Flexible licensing — because not everything is under Creative Commons.
- Safe — your research output is stored safely for the future in the same cloud infrastructure as research data from CERN’s Large Hadron Collider.
A major win is the easy DOI-linking of data and code for reproducible research, for free.
OCS is a free Web publishing tool that will create a complete Web presence for your scholarly conference. OCS will allow you to:
- create a conference Web site
- compose and send a call for papers
- electronically accept paper and abstract submissions
- allow paper submitters to edit their work
- post conference proceedings and papers in a searchable format
- post, if you wish, the original data sets
- register participants
- integrate post-conference online discussions
A peeriodical is a lightweight virtual journal with you as the Editor-in-chief, giving you complete freedom in setting editorial policy to select the most interesting and useful manuscripts for your readers.
is an open access online scholarly publishing platform that employs open post-publication peer review. You guessed it! We think transparency from start to finish is critical in scientific communication. …
Here’s a thing I would like to see said a little better, but think is important: An Adversarial Review of “Adversarial Generation of Natural Language”. The argument is that even though it’s nice that arXiv avoids some of the problems of traditional publishing, it has some of the problems that traditional publishing tries to avoid. This was foreseeable.
Cameron Neylon runs a cottage industry producing pragmatic publishing critique from an institutional economics perspective:
we’d been talking about communities, cultures, economics, “public-making” but it was the word ‘club’ and its associated concepts, both pejorative and positive that crystalised everything. We were talking about the clubbishness of making knowledge – the term “Knowledge Clubs” emerged quickly – but also the benefits that such a club might gain in choosing to invest in wider sharing.
In the business setting this often leads incumbent publishers to a kind of spluttering defense of the value they create, while simultaneously complaining that the customer doesn’t appreciate their work. Flip the target slightly and we’d call this “missing the new market opportunity” or “failing to express the value offering clearly”. …
Lingua, … has gone from one of the most important journals in analytical linguistics to no longer being in the field, and seems well on its way to becoming irrelevant. How does a company as competent in its business strategy as Elsevier let this happen? I would argue, as I did at the time that the former editorial board of Lingua resigned to form Glossa that it was a failure to understand the assets.
The neoliberal analysis of Lingua showed an asset generating good revenues, with good analytics and a positive ROI. The capitalist analysis focussed on the fixed assets and trademarks. But it turns out these weren’t what was creating value. What was creating value was the community, built around an editorial board and the good will associated with that.
Also, see Pushing costs downstream.
The new universal libraries
The biggest phenomenon in open access, as far as I can tell, is the massive pirate infrastructure providing open access to journals for free.
Copyright activism, Guerilla open access etc.
There is an interesting question here about mechanism design for the important business of science, to which I do not myself pretend to know good answers.
Afonso, Alexandre. 2013. “How Academia Resembles a Drug Gang.” Impact of Social Sciences. December 11, 2013. http://blogs.lse.ac.uk/impactofsocialsciences/2013/12/11/how-academia-resembles-a-drug-gang/.
Björk, Bo-Christer, and David Solomon. 2013. “The Publishing Delay in Scholarly Peer-Reviewed Journals.” Journal of Informetrics 7 (4): 914–23. https://doi.org/10.1016/j.joi.2013.09.001.
Bogich, Tiffany L, Sebastien Ballesteros, Robin Berjon, Chris Callahan, and Leon Chen. n.d. “On the Marginal Cost of Scholarly Communication.” https://research.science.ai/article/on-the-marginal-cost-of-scholarly-communication.
Himmelstein, Daniel S., Vincent Rubinetti, David R. Slochower, Dongbo Hu, Venkat S. Malladi, Casey S. Greene, and Anthony Gitter. 2019. “Open Collaborative Writing with Manubot.” Edited by Dina Schneidman-Duhovny. PLOS Computational Biology 15 (6): e1007128. https://doi.org/10.1371/journal.pcbi.1007128.
Krikorian, Gaëlle, and Amy Kapczynski. 2010. Access to Knowledge in the Age of Intellectual Property. New York; Cambridge, Mass.: Zone Books ; Distributed by the MIT Press. https://monoskop.org/images/e/e7/Krikorian_Kapczynski_eds_Access_to_Knowledge_in_the_Age_of_Intellectual_Property_2010.pdf.
Mensh, Brett, and Konrad Kording. 2016. “Ten Simple Rules for Structuring Papers.” Preprint. Scientific Communication and Education. https://doi.org/10.1101/088278.
Potts, Jason, John Hartley, Lucy Montgomery, Cameron Neylon, and Ellie Rennie. 2016. “A Journal Is a Club: A New Economic Model for Scholarly Publishing.” SSRN Scholarly Paper ID 2763975. Rochester, NY: Social Science Research Network. http://papers.ssrn.com/abstract=2763975.
Sand-Jensen, Kaj. 2007. “How to Write Consistently Boring Scientific Literature.” Oikos 116: 723–27.
Schimmer, Ralf, Kai Karin Geschuhn, and Andreas Vogler. 2015. “Disrupting the Subscription Journals’ Business Model for the Necessary Large-Scale Transformation to Open Access.” https://doi.org/10.17617/1.3.
Van Noorden, Richard. 2013. “Open Access: The True Cost of Science Publishing.” Nature 495 (7442): 426–29. https://doi.org/10.1038/495426a.