I would like to understand this more deeply, by following up on the references in this piece:

Scott Aaronson, Is “information is physical” contentful?

“Information is physical.”

This slogan seems to have originated around 1991 with Rolf Landauer. […] There are many things it’s taken to mean, in my experience, that don’t make a lot of sense when you think about them, or else they’re vacuously true, or purely a matter of perspective, or not faithful readings of the slogan’s words. […]

The late, great Jacob Bekenstein will forever be associated with the discovery that information, wherever and whenever it occurs in the physical world, takes up a minimum amount of space. The most precise form of this statement, called the covariant entropy bound, was worked out in detail by Raphael Bousso. Here I’ll be discussing a looser version of the bound, which holds in “non-pathological” cases, and which states that a bounded physical system can store at most \(\frac{A}{4 \ln 2}\) bits of information, where A is the area in Planck units of any surface that encloses the system: so, about \(10^{69}\) bits per square meter. (Actually it’s \(10^{69}\) qubits per square meter, but because of Holevo’s theorem, an upper bound on the number of qubits is also an upper bound on the number of classical bits that can be reliably stored in a system and then retrieved later.)
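To see where the “about \(10^{69}\) bits per square meter” figure comes from, one can run the arithmetic directly: a square meter contains \(1/\ell_P^2\) Planck areas, and the bound divides that by \(4 \ln 2\). A minimal sketch, assuming the CODATA value of the Planck length (the constant and function name here are my own, not from the post):

```python
import math

# Planck length in meters (CODATA value; an assumption, not stated in the post)
PLANCK_LENGTH = 1.616255e-35  # m

def max_bits(area_m2: float) -> float:
    """Looser Bekenstein-style bound: A / (4 ln 2) bits,
    where A is the enclosing area measured in Planck units."""
    area_planck_units = area_m2 / PLANCK_LENGTH**2
    return area_planck_units / (4 * math.log(2))

bits_per_m2 = max_bits(1.0)
print(f"{bits_per_m2:.2e} bits per square meter")
```

The result comes out near \(1.4 \times 10^{69}\), matching the rounded figure in the quote.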

You might have heard of the famous way Nature enforces this bound. Namely, if you tried to create a hard drive that stored more than \(10^{69}\) bits per square meter of surface area, the hard drive would necessarily collapse to a black hole.
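Black holes are the systems that saturate this bound: the Bekenstein–Hawking entropy of a horizon is exactly \(A/4\) in Planck units (nats), i.e. \(A/(4 \ln 2)\) bits. A rough numerical sketch for a solar-mass black hole, using standard SI constants (the values and helper names below are my own assumptions, not from the post):

```python
import math

# SI constants (assumed standard values, not taken from the post)
G = 6.674e-11                  # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8                    # speed of light, m/s
PLANCK_LENGTH = 1.616255e-35   # m
SOLAR_MASS = 1.989e30          # kg

def schwarzschild_area(mass_kg: float) -> float:
    """Horizon area of a Schwarzschild black hole: A = 4*pi*(2GM/c^2)^2."""
    r = 2 * G * mass_kg / C**2
    return 4 * math.pi * r**2

def bound_bits(area_m2: float) -> float:
    """The bound from the quote: A / (4 ln 2) bits, with A in Planck units."""
    return area_m2 / PLANCK_LENGTH**2 / (4 * math.log(2))

# A solar-mass black hole's horizon already holds the maximum the bound allows.
solar_bh_bits = bound_bits(schwarzschild_area(SOLAR_MASS))
print(f"{solar_bh_bits:.2e} bits")
```

This comes out around \(1.5 \times 10^{77}\) bits: nothing of solar mass can store more, which is the sense in which an over-dense hard drive must already be a black hole.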

TBC.
