Biological basis of language



Neurology of language

Dan Stowell summarises a neural basis for recursive syntax:

For decades, Noam Chomsky and colleagues have famously been developing and advocating a “minimalist” (Bolhuis et al. 2014) idea about the machinery our brain uses to process language. […] They propose that not much machinery is needed, and one of the key components is a “merge” operation that the brain uses in composing and decomposing grammatical structures.

Then yesterday I was reading this introduction to embeddings in artificial neural networks and NLP, and I read the following:

“Models like [this] are powerful, but they have an unfortunate limitation: they can only have a fixed number of inputs. We can overcome this by adding an association module, A, which will take two word or phrase representations and merge them.” (Bottou 2011)
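
For concreteness, here is a minimal sketch of what such an association / “merge” module could look like in code: a function that maps two fixed-size representations to a single representation of the same size, and which can therefore be nested to arbitrary depth over a parse tree. The dimensions, parameters and toy lexicon below are illustrative assumptions, not Bottou’s (or anyone’s) trained model.

```python
# Minimal sketch of a "merge"/association module in the spirit of the
# Bottou (2011) quote above: combine two fixed-size word/phrase embeddings
# into one representation of the same size, so the operation can be applied
# recursively. All names, sizes and (untrained) parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
DIM = 4  # embedding dimensionality (assumed)

# Parameters of the association module A (random here; would be learned).
W = rng.normal(scale=0.1, size=(DIM, 2 * DIM))
b = np.zeros(DIM)

def merge(left: np.ndarray, right: np.ndarray) -> np.ndarray:
    """Compose two representations into one of the same size."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

# Toy lexicon of word embeddings (assumed).
emb = {w: rng.normal(size=DIM) for w in ["the", "cat", "sat"]}

# Because merge maps R^d x R^d -> R^d, it can be nested to arbitrary depth,
# e.g. merge("the", "cat"), then merge that phrase with "sat".
np_phrase = merge(emb["the"], emb["cat"])
sentence = merge(np_phrase, emb["sat"])
print(sentence.shape)  # (4,)
```

The point of the sketch is only that a single composition operator with matching input and output types gives you unbounded recursive structure for free, which is what makes the analogy with “merge” tempting.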

Analogy with artificial neural networks

TBD

Evolution of language

TBD

Computational plausibility

See syntax.

Meaning

See semantics.

Scrapbook

“They’re using phrase-structure grammar, long-distance dependencies. FLN recursion, at least four levels deep and I see no reason why it won’t go deeper with continued contact. […] It doesn’t have a clue what I’m saying.”

“What?”

“It doesn’t even have a clue what it’s saying back,” she added.

Peter Watts, Blindsight

Sam Kriss calls the spamularity the language of god:

What is machine language? Firstly, machine language is vampiric, shamanic, xenophagic, mocking. It’s a changeling. Often it tries to imitate human discourse; the machine wants you to think that it’s human. This is the first level of deception. Often this isn’t enough: machines will use various methods to take over other text-producing systems, so that without your knowledge you end up advertising weight loss pills to all your old school friends. First axiom: all language has the potential to become machine language. To become infected. 10 Award-Winning GIFs That Will Leave You Wanting More. I Could Watch #4 For Days. This is the second level of deception. In the third level of deception, the machine convinces itself that it has a physically extended body, that it has an independent mind, that it really wants to produce the text it generates. This might happen very soon. It might have already happened, somewhere on a dusty plain in western Africa, somewhere that never really existed, tens of thousands of years ago.

References

Angluin, Dana. 1987. “Learning Regular Sets from Queries and Counterexamples.” Information and Computation 75 (2): 87–106.
———. 1988. “Identifying Languages from Stochastic Examples.” No. YALEU/DCS/RR-614.
Berwick, Robert C., Kazuo Okanoya, Gabriel J.L. Beckers, and Johan J. Bolhuis. 2011. “Songs to Syntax: The Linguistics of Birdsong.” Trends in Cognitive Sciences 15 (3): 113–21.
Blazek, Paul J., and Milo M. Lin. 2020. “A Neural Network Model of Perception and Reasoning.” arXiv:2002.11319 [Cs, q-Bio], February.
Bolhuis, Johan J., Ian Tattersall, Noam Chomsky, and Robert C. Berwick. 2014. “How Could Language Have Evolved?” PLoS Biology 12 (8): e1001934.
Bottou, Léon. 2011. “From Machine Learning to Machine Reasoning.” arXiv:1102.1808 [Cs], February.
Cancho, Ramon Ferrer i, and Ricard V. Solé. 2003. “Least Effort and the Origins of Scaling in Human Language.” Proceedings of the National Academy of Sciences 100 (3): 788–91.
Christiansen, Morten H., and Nick Chater. 2008. “Language as Shaped by the Brain.” Behavioral and Brain Sciences 31: 489–509.
Elman, Jeffrey L. 1991. “Distributed Representations, Simple Recurrent Networks, and Grammatical Structure.” Machine Learning 7: 195–225.
———. 1993. “Learning and Development in Neural Networks: The Importance of Starting Small.” Cognition 48: 71–99.
———. 1995. “Language as a Dynamical System,” 195.
Fitch, W. Tecumseh. 2006. “The Biology and Evolution of Music: A Comparative Perspective.” Cognition 100 (1): 173–215.
Hart, Carl L. 2021. Drug Use for Grown-Ups: Chasing Liberty in the Land of Fear. First Edition. New York: Penguin Press.
Hauser, Marc, and Jeffrey Watumull. 2016. “The Universal Generative Faculty: The Source of Our Expressive Power in Language, Mathematics, Morality, and Music.” Journal of Neurolinguistics 43 (November): 78–94.
Kirby, Simon. 1998. “Learning, Bottlenecks and the Evolution of Recursive Syntax.” In.
———. 2003. Language Evolution. Oxford University Press, USA.
Marcus, Gary, Adam Marblestone, and Thomas Dean. 2014. “The Atoms of Neural Computation.” Science 346 (6209): 551–52.
McClelland, James L., Matthew M. Botvinick, David C. Noelle, David C. Plaut, Timothy T. Rogers, Mark S. Seidenberg, and Linda B. Smith. 2010. “Letting Structure Emerge: Connectionist and Dynamical Systems Approaches to Cognition.” Trends in Cognitive Sciences 14 (8): 348–56.
Nowak, Martin A., and David C. Krakauer. 1999. “The Evolution of Language.” Proceedings of the National Academy of Sciences of the United States of America 96 (14): 8028.
Petersson, Karl-Magnus, Vasiliki Folia, and Peter Hagoort. 2012. “What Artificial Grammar Learning Reveals about the Neurobiology of Syntax.” Brain and Language, The Neurobiology of Syntax, 120 (2): 83–95.
Plotkin, Joshua B., and Martin A. Nowak. 2000. “Language Evolution and Information Theory.” Journal of Theoretical Biology 205: 147–59.
Pylkkänen, Liina. 2019. “The Neural Basis of Combinatory Syntax and Semantics.” Science 366 (6461): 62–66.
Salakhutdinov, Ruslan. 2015. “Learning Deep Generative Models.” Annual Review of Statistics and Its Application 2 (1): 361–85.
Scarle, Simon. 2009. “Implications of the Turing Completeness of Reaction-Diffusion Models, Informed by GPGPU Simulations on an XBox 360: Cardiac Arrhythmias, Re-entry and the Halting Problem.” Computational Biology and Chemistry 33 (4): 253–60.
Solé, Ricard V., Bernat Corominas-Murtra, Sergi Valverde, and Luc Steels. 2010. “Language Networks: Their Structure, Function, and Evolution.” Complexity 15: 20–26.
