How do brains work?

I mean, how do brains work at a level slightly higher than the synapse but much lower than, e.g., psychology? "How is thought done?" and so on.

Notes pertaining to *large, artificial* networks are filed under
artificial neural networks.
The messy, biological end of the stick is here.
Since brains seem to be the seat of the flashiest and most important part of the
computing taking place in our bodies, we
understandably want to know how they work, in order to

- fix Alzheimer's disease
- steal cool learning tricks
- endow the children of elites with superhuman mental prowess to cement their places as Übermenschen fit to rule the thousand-year Reich
- …or whatever.

Real brains differ from the "neuron-inspired" computation of the simulacrum in many ways, beyond the usual gap between model and reality. The resemblance between "neural networks" and actual neurons is intentionally loose, for reasons of convenience.

For one example, most simulated neural networks are based on continuous activations updated in discrete time steps, unlike spiking biological neurons, which communicate through discrete events in continuous time.
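To make the contrast concrete, here is a minimal sketch in NumPy: a rate-style unit as used in artificial networks next to a leaky integrate-and-fire (LIF) neuron, a standard toy model of spiking dynamics. The parameter values are illustrative, not taken from any particular biological fit.

```python
import numpy as np

# Rate-style unit, as in most artificial networks:
# a continuous-valued activation, computed once per discrete step.
def rate_unit(x, w, b):
    """One forward step: weighted sum pushed through a sigmoid."""
    return 1.0 / (1.0 + np.exp(-(np.dot(w, x) + b)))

# Leaky integrate-and-fire neuron: the membrane potential evolves in
# (approximately) continuous time; the output is discrete spike events.
def lif_spikes(input_current, dt=1e-4, tau=0.02, v_thresh=1.0, v_reset=0.0):
    """Euler integration of dv/dt = (-v + I) / tau; returns spike times."""
    v, spikes = 0.0, []
    for step, i_in in enumerate(input_current):
        v += dt * (-v + i_in) / tau
        if v >= v_thresh:           # threshold crossing = a spike event
            spikes.append(step * dt)
            v = v_reset             # reset after the spike
    return spikes

# Constant drive above threshold produces a regular spike train.
times = lif_spikes(np.full(10_000, 1.5))  # one second of input at dt=1e-4
```

The rate unit returns one number per step; the LIF neuron returns a list of event times, which is the currency real neurons appear to trade in.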

Real brains also support heterogeneous neuron types, have messier layer organisation, use far less power, and have no well-defined backpropagation (or at least not in the same form), among many other differences that I, as a non-specialist, do not know.

Things to learn more about:

- I just saw a talk by Dan Cireșan in which he mentioned the importance of "foveation": blurring the edges of an image when training classifiers on it, thus encouraging the network to ignore peripheral detail in order to learn better. What, rigorously speaking, is happening there? A nice actual crossover between biological neural nets and artificial ones.
- Algorithmic statistics of neurons sounds interesting.
- Modelling mind as machine learning.
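One simple reading of the foveation trick mentioned above: keep the image sharp near a fixation point and blend toward a blurred copy with distance from it. The sketch below is my own minimal interpretation, not Cireșan's actual preprocessing; the Gaussian radial-mixing scheme and the box blur are illustrative assumptions.

```python
import numpy as np

def box_blur(img, k=5):
    """Crude k-by-k box blur of a 2D grayscale image (edge-padded)."""
    padded = np.pad(img.astype(float), k // 2, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def foveate(img, center, sigma=20.0, k=7):
    """Blend sharp and blurred copies by distance from the fixation point."""
    ys, xs = np.indices(img.shape)
    dist = np.hypot(ys - center[0], xs - center[1])
    weight = np.exp(-(dist ** 2) / (2 * sigma ** 2))  # 1 at the fovea, ~0 far out
    return weight * img + (1 - weight) * box_blur(img, k)
```

At the fixation point the weight is exactly 1, so the "fovea" is untouched; detail is progressively discarded toward the periphery, which is roughly the inductive bias the talk was gesturing at.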

## Fun data

## How computationally complex is a neuron?

Empirically quantifying computation is hard, but people try to do it all the time for brains. Classic approaches estimate the structure in neural spike trains (Crumiller et al. 2011; Haslinger, Klinkner, and Shalizi 2010; Nemenman, Bialek, and de Ruyter van Steveninck 2004), often via empirical entropy estimates.
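The baseline behind those papers is the naive "plug-in" entropy estimate: chop a binarised spike train into short words and take the entropy of the empirical word distribution. A minimal sketch (function names are my own; the cited literature is largely about correcting this estimator's small-sample bias):

```python
import numpy as np
from collections import Counter

def plugin_entropy(spike_train, word_length=3):
    """Naive plug-in entropy (bits per word) of a binary spike train.

    Slices the train into non-overlapping binary 'words' and uses the
    empirical word frequencies. Biased downward for small samples --
    bias-corrected estimators exist precisely because of this.
    """
    n_words = len(spike_train) // word_length
    words = [tuple(spike_train[i * word_length:(i + 1) * word_length])
             for i in range(n_words)]
    probs = np.array([c / n_words for c in Counter(words).values()])
    return float(-np.sum(probs * np.log2(probs)))

rng = np.random.default_rng(0)
train = (rng.random(30_000) < 0.1).astype(int)  # Bernoulli(0.1) "spikes"
h = plugin_entropy(train)  # close to 3 * H(0.1), about 1.41 bits per word
```

For independent Bernoulli spikes the answer is known in closed form, which makes it a handy sanity check; real spike trains have temporal structure, which is exactly what these estimates are meant to detect.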

If we are prepared to accept "size of the neural network needed to approximate X" as an estimate of the complexity of X, then there are some interesting results: see Allison Whitten, How Computationally Complex Is a Single Neuron?, covering (Beniaguev, Segev, and London 2021).
On the other hand, finding the *smallest* neural network that can approximate something is itself computationally hard, and not in general even easy to check.

## Pretty pictures of neurons

The names I am looking for here, for beautiful hand-drawn early neuron diagrams, are Camillo Golgi and Santiago Ramón y Cajal, especially the latter.

## References

*Neural Computation* 16 (4): 717–36.

*Journal of Neuroscience Methods* 105 (1): 25–37.

*Neuron* 109 (17): 2727–2739.e3.

*Trends in Cognitive Sciences* 15 (3): 113–21.

*Neural Computation* 0 (0): 080804143617793–28.

*PLoS Comput Biol* 8 (6): e1002561.

*Nature Reviews Neuroscience* 6 (10): 755–65.

*PLoS Comp. Biol.* 10: e1003963.

*Journal of Psychopharmacology* 31 (9): 1091–1120.

*Frontiers in Cellular Neuroscience* 13.

*Frontiers in Neuroscience* 5: 90.

*Neural Computation* 16 (5): 971–98.

*Cognitive Science* 14: 179–211.

*Cognition* 48: 71–99.

*Annals of the New York Academy of Sciences* 1016: 153–70.

*Journal of The Royal Society Interface* 4 (12): 41.

*The Annals of Statistics* 27 (4): 1119–41.

*Current Biology* 16 (5): R147–51.

*Neural Computation* 22 (1): 121–57.

*Neural Computation* 22 (10): 2477–2506.

*Physical Review E* 80 (5): 051902.

*PLoS Comput Biol* 7 (3): e1001108.

*PLOS Computational Biology* 13 (1): e1005268.

*Annual Review of Statistics and Its Application* 5 (1): 183–214.

*PLoS ONE* 6 (9): e24516.

*Cell* 180 (3): 552–567.e25.

*arXiv Preprint arXiv:1508.06818*.

*BMC Neuroscience* 16 (Suppl 1): P196.

*Advances in Neural Information Processing Systems* 19: 801.

*Science* 346 (6209): 551–52.

*Physical Review E* 69 (5): 056111.

*Nature* 381 (6583): 607–9.

*Current Opinion in Neurobiology* 14 (4): 481–87.

*Neural Computation* 29 (8): 2021–29.

*Parallel Distributed Processing: Explorations in the Microstructure of Cognition*. 1986. MIT Press.

*Scientific Reports* 9 (1): 1889.

*Neuroscience* 61 (4): 991–1006.

*PLoS ONE* 7 (9): e44436.

*arXiv:1610.06551 [Stat]*, October.

*Annual Review of Neuroscience* 24 (1): 1193–1216.

*Neural Computation* 15 (5): 965–91.

*Advances in Neural Information Processing Systems*, 1289–96.

*Nature* 439 (7079): 978–82.

*Neural Computation* 17 (1): 19–45.

*Organic and Functional Nervous Diseases: A Text-Book of Neurology*. New York and Philadelphia: Lea & Febiger.

*Proceedings of the National Academy of Sciences* 111 (51): 18183–88.

*Phys. Rev. Lett.* 80 (1): 197–200.

*Neural Computation* 27 (1): 1–31.

*PLoS Computational Biology* 11 (3).

*Frontiers in Human Neuroscience* 8.
