Soft methodology of science

January 13, 2012 — September 6, 2022

Tags: adaptive · collective knowledge · economics · faster pussycat · how do science · incentive mechanisms · institutions · mind · networks · sociology
Figure 1: The course of science

In which I collect tips from esteemed and eminent minds about how to go about pro-actively discovering stuff. More meta-tips than specific agendas for discovery.

Related: Progress studies, citizen science.

1 Habits of highly effective scientists

Classic:

  • Curiosity

According to SLIME MOLD TIME MOLD:

  • Stupidity
  • Arrogance
  • Laziness
  • Carefreeness
  • Beauty
  • Rebellion
  • Humor
Figure 2: Sarah Perry

Hard to Write

In science, the process is just as important as the discoveries: improving our scientific processes will speed up our rate of discovery. Feyerabend claims contemporary research has over-indexed on processes such as the scientific method, and that this rigidity has restrained innovation. The crux of his book Against Method (Feyerabend and Hacking 2010) is that paradigm shifts in science stem from epistemological anarchism. Epistemology here means the study of how beliefs are formed and justified. This anarchy, for the Thomas Kuhn fans, is what is necessary to reach Kuhn’s Stage 4 of science, the ‘revolutionary phase’ in which new paradigms are created. In recent decades we have placed too much importance on science being consistent, while forgetting that paradigm shifts often come from those who refute mainstream assumptions. In other words, the geniuses who generated scientific paradigm shifts were anarchists to their contemporaries.

Jan Hendrik Kirchner, Via productiva

Be wary, be very wary indeed, of engaging with a Hydra problem. You might be able to identify a Hydra problem by keeping track of the branching factor: how many subproblems does each step of your derivation introduce? If this continues to be larger than 1, your approach is probably incorrect, or at least infeasible. This might appear obvious, but history is full of famous examples of researchers getting stuck on epicycles: they produced “pseudo-answers” to questions, but these opened more questions than they closed. I observe the pattern in my colleagues, in my students, and in myself.
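To make the branching-factor heuristic concrete, here is a minimal sketch in Python, with entirely hypothetical numbers, of the bookkeeping Kirchner describes: log, for each derivation step, how many open subproblems it closed and how many new ones it introduced, and flag any step whose ratio exceeds 1.

```python
# Hypothetical derivation log: each step records how many open subproblems
# it closed and how many new ones it introduced.
steps = [
    {"closed": 1, "opened": 2},   # step 1: answered one question, raised two
    {"closed": 1, "opened": 3},   # step 2: looking distinctly Hydra-like
    {"closed": 2, "opened": 1},   # step 3: finally closing more than it opens
]

open_problems = 1  # we start with just the original problem
for i, step in enumerate(steps, start=1):
    open_problems += step["opened"] - step["closed"]
    branching = step["opened"] / step["closed"]
    flag = "  <- Hydra warning" if branching > 1 else ""
    print(f"step {i}: branching factor {branching:.1f}, "
          f"{open_problems} problems still open{flag}")

# If the branching factor stays above 1, the count of open problems grows
# geometrically and the approach is probably infeasible.
```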

2 How precious is my idea?

Figure 3

My understanding about which parts of research are hard and which things are valuable has changed.

Let us suppose that I was worried about being “scooped”, i.e. beaten to publication, which is a thing that people in science tend to worry about a little, but not so much as I imagined before I joined in for myself. How valuable is an insight? It depends. An insight like “this might be worth trying, who knows” is not valueless, but the opportunity cost of trying it is possibly months of specialist time or years of grad-student time. Let us call such ideas intuitions. Even if my confidence in an intuition is high, the cost of acting on it is also high, so it is not worth keeping this kind of thing quiet; rather, you want to share it as widely as possible so the work and the risk can be shared between multiple researchers. Having good intuitions is useful, but if anything you want to cultivate the habit of giving them away.

If my idea is “I can definitely solve this particular problem in a particular way”, the value of this intelligence is higher. Let us call these solutions. Often, simply knowing that a solution exists in a particular domain can lead someone else rapidly to a solution, because it constrains the search enough for other people to find it quickly. This is because most of mathematics, at least for me, is trying bone-headed things that turn out to be silly, and merely narrowing the search space can make things drastically easier. So if you are in a competitive, idea-stealing environment, knowledge that a solution exists tends to be valuable intelligence. It is not the whole way to a paper that will get you credit for being clever (and I personally value a paper which has more than one great idea in it), but a good solution is an important chunk.
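As a toy illustration of how much pure existence-intelligence can be worth, here is a sketch with hypothetical numbers, assuming a single working candidate found by blind trial and error, comparing expected search effort with and without knowing which subdomain contains the solution.

```python
import random

def expected_trials(n_candidates: int, n_runs: int = 5000) -> float:
    """Average number of blind guesses needed to hit the single working candidate."""
    total = 0
    for _ in range(n_runs):
        solution = random.randrange(n_candidates)
        order = random.sample(range(n_candidates), n_candidates)  # random trial order
        total += order.index(solution) + 1
    return total / n_runs

# Uninformed search over 1000 candidate approaches: roughly (1000 + 1) / 2 tries.
print(expected_trials(1000))
# After learning the solution lies in a 50-candidate subdomain: roughly 25 tries.
print(expected_trials(50))
```

The point of the toy model is only that the constraint, not any extra cleverness, does most of the work of speeding up the search.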

On the other hand, academia can often operate like a cooperative endeavour, or even a competitively cooperative potlatch, in which giving away valuable things is a way to signal status and attract collaborators and funding.

Overcoming Bias: The InDirect-Check Sweet Spot

Tony Kulesa, in Tyler Cowen is the best curator of talent in the world, gives a fannish profile of Tyler Cowen, including ideas about how one might become expert at identifying underexploited talent. Part of that process, interestingly, is not having a committee process, and being shamelessly personal-judgment-driven. Insert speculation about the wisdom of crowds versus the madness of committees. Also, equity concerns.

3 Sidling up to the truth

As a researcher motivated by big picture ideas (How can we survive on planet earth without being consumed in battles over dwindling resources and environmental crises?) as much as aesthetic ideas, I am sometimes considered damaged goods. A fine scientist, many claim, is safely myopic, the job of discovery being detailed piecework.

On one hand, the various research initiatives that pay my way are tied to various real-world goals (“Predict financial crises!”, “Tell us the future of the climate!”). On the other, the researchers involved tell me that it is useless to try to solve these large issues wholesale; one must instead identify small retail questions on which one can hope to make progress. Then again, they have just agreed to take a lot of money to solve big problems. In this school, then, the presumed logic is that one takes a large research grant to strike lights on lots of small problems that lie in the penumbra of the large issue, in the hope that one of them flares up to illuminate the shade. Or burns the lot to the ground. The example given by the Oxonian scholar who most recently expounded this to me was Paul David and the path dependence of the QWERTY keyboard: a deep issue, the contingency of the world, seen through the tiny window opened by substandard keyboard design.

Truth, in these formulations, is a cat: don’t look at it directly or it will perversely slope off to rub against someone else’s leg. The art is all in the sidling-up, the feigned disinterest, the waiting for truth to come up and show you its belly. I’m not sure I am persuaded by this. It’s the kind of science that would be expounded in an educational film directed by Alejandro Jodorowsky.

On the other hand, I’m not sure that I buy the grant-makers’ side of this story either, at least the story that grant-makers seem to expound in Australia, which is that they give out money for you to go out and find something out. There are productivity outcomes on the application form where you fill out the goals that your research will fulfil, which rules out much research by restricting you largely to marginally refining a known-good idea rather than trying something new. I romantically imagine that in much research you would not know what you were discovering in advance.

The compromise is that we meet in the middle and swap platitudes. We will “improve our understanding of X”; we will “find strategies to better manage Y”. We certainly don’t mention that we might spend a while pondering keyboard layouts when the funders have asked us to work out how to manage a complex non-linear economy.

4 Disruption by field outsiders

Figure 4

Are fields plagued by hyperselection? Can we find fields that are ripe for disruption by outsiders with radical out-of-the-box ideas? Do we need left-field eccentrics to roam about asking if the emperor has clothes?

Is it just stirring the pot? How many physicists, to choose one example, can get published while ignoring everyone else’s advances?

How do you know that your left-field idea is the radically simple one that causes the entire field to advance? And how do you know that it is not the crazed rambling of someone who has missed the advances of the last several decades, an inmate wandering out of the walled disciplinary asylum in a dressing gown, railing against the Vietnam War?

5 Optimising versus satisficing

6 Interaction effects

See interaction effects.

7 Consultant as co-genius

Edward Kmett seems popular.

8 Incoming

  • Alexey Guzey quotes an anonymous twitter user:

    The bare truth of science:

    • Nobody believes a computational model except the person who built it.
    • Everybody believes an experimental measurement except the person who made it.

    — … June 28, 2018

9 References

Alon. 2009. “How to Choose a Good Scientific Problem.” Molecular Cell.
Arbesman, and Christakis. 2011. “Eurekometrics: Analyzing the Nature of Discovery.” PLoS Comput Biol.
Azoulay, Fons-Rosen, and Zivin. 2015. “Does Science Advance One Funeral at a Time?” Working Paper 21788.
Devezer, Nardin, Baumgaertner, et al. 2019. “Scientific Discovery in a Model-Centric Framework: Reproducibility, Innovation, and Epistemic Diversity.” PLOS ONE.
Duchin. 2004. “The Sexual Politics of Genius.”
Dyba, Kitchenham, and Jorgensen. 2005. “Evidence-Based Software Engineering for Practitioners.” IEEE Software.
Feyerabend, and Hacking. 2010. Against Method.
Montagnes, Montagnes, and Yang. 2022. “Finding Your Scientific Story by Writing Backwards.” Marine Life Science & Technology.
Newman. 2009. “The First-Mover Advantage in Scientific Publication.” EPL (Europhysics Letters).
Nissen, Magidson, Gross, et al. 2016. “Publication Bias and the Canonization of False Facts.” arXiv:1609.00494 [Physics, Stat].
Rekdal. 2014. “Academic Urban Legends.” Social Studies of Science.
Schwartz. 2008. “The Importance of Stupidity in Scientific Research.” Journal of Cell Science.
Smaldino, and O’Connor. 2020. “Interdisciplinarity Can Aid the Spread of Better Methods Between Scientific Communities.”
Thagard. 1993. “Societies of Minds: Science as Distributed Computing.” Studies in History and Philosophy of Modern Physics.
———. 1997. “Collaborative Knowledge.” Noûs.
———. 2005. “How to Be a Successful Scientist.” Scientific and Technological Thinking.
Weng, Flammini, Vespignani, et al. 2012. “Competition Among Memes in a World with Limited Attention.” Scientific Reports.