I would like to understand this more deeply, through following up the references in this piece:

Scott Aaronson, Is “information is physical” contentful?

“Information is physical.”

This slogan seems to have originated around 1991 with Rolf Landauer.

…There are many things it’s taken to mean, in my experience, that don’t make a lot of sense when you think about them, or else they’re vacuously true, or purely a matter of perspective, or not faithful readings of the slogan’s words. … The late, great Jacob Bekenstein will forever be associated with the discovery that information, wherever and whenever it occurs in the physical world, takes up a minimum amount of space. The most precise form of this statement, called the covariant entropy bound, was worked out in detail by Raphael Bousso. Here I’ll be discussing a looser version of the bound, which holds in “non-pathological” cases, and which states that a bounded physical system can store at most \(\frac{A}{4 \ln 2}\) bits of information, where A is the area in Planck units of any surface that encloses the system – so, about \(10^{69}\) bits per square meter. (Actually it’s \(10^{69}\) qubits per square meter, but because of Holevo’s theorem, an upper bound on the number of qubits is also an upper bound on the number of classical bits that can be reliably stored in a system and then retrieved later.)

You might have heard of the famous way Nature enforces this bound. Namely, if you tried to create a hard drive that stored more than \(10^{69}\) bits per square meter of surface area, the hard drive would necessarily collapse to a black hole.
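To check that the quoted figure follows from the bound, here is a quick back-of-the-envelope calculation: \(\frac{A}{4 \ln 2}\) bits, with A measured in Planck areas, converted to bits per square meter. The constant values are standard CODATA approximations; the function name `max_bits` is just a label for this sketch, not anything from the quoted post.

```python
import math

# Physical constants in SI units (CODATA approximate values).
hbar = 1.054_571_817e-34  # reduced Planck constant, J*s
G = 6.674_30e-11          # Newtonian gravitational constant, m^3 kg^-1 s^-2
c = 2.997_924_58e8        # speed of light, m/s

# Planck length and Planck area.
l_planck = math.sqrt(hbar * G / c**3)  # ~1.6e-35 m
a_planck = l_planck**2                 # ~2.6e-70 m^2

def max_bits(area_m2: float) -> float:
    """Loose holographic bound: A / (4 ln 2) bits, with A in Planck units."""
    area_planck_units = area_m2 / a_planck
    return area_planck_units / (4 * math.log(2))

print(f"{max_bits(1.0):.2e} bits per square meter")  # ~1.4e69
```

The result, about 1.4 × 10^69 bits per square meter, matches the rounded \(10^{69}\) in the quote.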

TBC.