# Hardware for neural networks

Neuromorphic computing, non-von-Neumann architectures, and other ways to compute for AI

November 16, 2023 — March 25, 2024


A placeholder for thinking about the implementation and theory of computation as they are perturbed by our increasing dependence on neural models for computing.

## 1 Quantum devices

TBD

## 2 Optical devices

I am slightly familiar with Igor Carron’s work on optical processing using randomized linear algebra (Brossollet et al. 2021; Cavaillès et al. 2022; Mohseni, McMahon, and Byrnes 2022).
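The core mathematical operation in Brossollet et al. (2021) is an optically computed random projection: the device returns the intensity of a random complex linear transform of the input, roughly y = |Rx|². A minimal NumPy sketch of that operation follows; the dense complex Gaussian matrix, dimensions, and scaling here are illustrative stand-ins for the physical scattering medium, not the actual hardware behaviour.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out = 64, 256

# A complex Gaussian matrix stands in for the (unknown) transmission
# matrix of the scattering medium in an optical processing unit.
R = (rng.normal(size=(d_out, d_in))
     + 1j * rng.normal(size=(d_out, d_in))) / np.sqrt(2 * d_in)

def opu_like_features(x: np.ndarray) -> np.ndarray:
    """Nonlinear random features: intensity of a random complex projection."""
    return np.abs(R @ x) ** 2

x = rng.normal(size=d_in)
y = opu_like_features(x)
print(y.shape)  # (256,)
```

Such nonnegative random features can then feed a cheap linear model, which is the usual randomized-linear-algebra trick: the expensive matrix multiply happens at the speed of light, and only the small readout is trained.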

## 3 Ising machines
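Ising machines (Mohseni, McMahon, and Byrnes 2022) are physical devices that relax toward low-energy configurations of an Ising energy E(s) = −½ sᵀJs over spins s ∈ {−1, +1}ⁿ, which is how they act as hardware solvers for combinatorial optimization. As a software stand-in for the dynamics, here is a minimal simulated-annealing sketch; the problem size, random couplings, temperature schedule, and step count are all arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20

# Random symmetric couplings with zero diagonal: an arbitrary test instance.
J = rng.normal(size=(n, n))
J = (J + J.T) / 2
np.fill_diagonal(J, 0.0)

def energy(s: np.ndarray) -> float:
    """Ising energy E(s) = -1/2 s^T J s."""
    return float(-0.5 * s @ J @ s)

s = rng.choice([-1, 1], size=n)
T = 2.0
for _ in range(5000):
    i = rng.integers(n)
    # Energy change from flipping spin i: dE = 2 s_i * (local field h_i).
    dE = 2.0 * s[i] * (J[i] @ s)
    if dE < 0 or rng.random() < np.exp(-dE / T):
        s[i] = -s[i]
    T *= 0.999  # geometric cooling schedule

print(energy(s))
```

The point of the hardware is that the physics performs this relaxation natively and in parallel, rather than one Metropolis flip at a time.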

## 4 References

Aimone, and Parekh. 2023. “The Brain’s Unique Take on Algorithms.” *Nature Communications*.

Brossollet, Cappelli, Carron, et al. 2021. “LightOn Optical Processing Unit: Scaling-up AI and HPC with a Non von Neumann Co-Processor.”

Cavaillès, Boucher, Daudet, et al. 2022. “A High-Fidelity and Large-Scale Reconfigurable Photonic Processor for NISQ Applications.” *Optics Express*.

Gershenfeld, N. 1996. “Signal Entropy and the Thermodynamics of Computation.” *IBM Systems Journal*.

Gershenfeld, Neil A. 2000. *The Physics of Information Technology*. Cambridge Series on Information and the Natural Sciences.

Gershenfeld, Neil. 2011. “Aligning the Representation and Reality of Computation with Asynchronous Logic Automata.” *Computing*.

Hooker. 2020. “The Hardware Lottery.” *arXiv:2009.06489 [Cs]*.

Jaeger, Noheda, and van der Wiel. 2023. “Toward a Formal Theory for Computing Machines Made Out of Whatever Physics Offers.” *Nature Communications*.

Mohseni, McMahon, and Byrnes. 2022. “Ising Machines as Hardware Solvers of Combinatorial Optimization Problems.” *Nature Reviews Physics*.

Zhu, Lilak, Loeffler, et al. 2023. “Online Dynamical Learning and Sequence Memory with Neuromorphic Nanowire Networks.” *Nature Communications*.