Why is confidentiality broken?
One reason infosec is hard is that, like pollution, it is you who bears the cost of my laziness. Another is that there is a lot of unintuitive weirdness in how to be private, from the terminology, through bad user interface design, to the mathematical complexity we face, and this engenders much despair and confusion. A third, I would argue, is that the first two are leveraged by spies, wanna-be authoritarian governments, and creepy business models to get our collusion in spying on ourselves. So, what can we do, practically, to ensure better “herd confidentiality” for necessary social function?
It’s hard to explain to regular people how much technology barely works, how much the infrastructure of our lives is held together by the IT equivalent of baling wire.
Computers, and computing, are broken.
For a bunch of us, especially those who had followed security and the warrantless wiretapping cases, the revelations weren’t big surprises. We didn’t know the specifics, but people who keep an eye on software knew computer technology was sick and broken. We’ve known for years that those who want to take advantage of that fact tend to circle like buzzards. The NSA wasn’t, and isn’t, the great predator of the internet, it’s just the biggest scavenger around. It isn’t doing so well because they are all-powerful math wizards of doom.
The NSA is doing so well because software is bullshit. […] Managing all the encryption and decryption keys you need to keep your data safe across multiple devices, sites, and accounts is theoretically possible, in the same way performing an appendectomy on yourself is theoretically possible.
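As an aside on why key management scales so badly (standard textbook counting, not from the quote above): with purely symmetric keys, every pair of communicating parties needs its own shared secret, while public-key cryptography needs only one key pair per party. A minimal sketch:

```python
# Standard key-counting formulas: pairwise symmetric keys grow
# quadratically with the number of parties; public-key crypto
# needs only one (public, private) pair per party.

def symmetric_keys(n: int) -> int:
    """Shared secrets needed so every pair of n parties can talk."""
    return n * (n - 1) // 2

def publickey_keys(n: int) -> int:
    """Total keys (public + private) with one key pair per party."""
    return 2 * n

for n in (10, 100, 1000):
    print(n, symmetric_keys(n), publickey_keys(n))
# At 1000 parties: 499,500 pairwise secrets vs 2,000 keys.
```

Even the cheaper public-key regime leaves each person juggling keys across every device and account, which is the appendectomy the quote is describing.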
I mean, information technology is complicated. Look at what is involved in sending an SMS.
Or practically, as with Chris Palmer’s State of Security, which points out the specifics of our leaky-sieve information tools.
Here is a lucid explanation by Ross Anderson:
information insecurity is at least as much due to perverse incentives. Many of the problems can be explained more clearly and convincingly using the language of microeconomics: network externalities, asymmetric information, moral hazard, adverse selection, liability dumping and the tragedy of the commons. […]
In a survey of fraud against autoteller machines, it was found that patterns of fraud depended on who was liable for them. In the USA, if a customer disputed a transaction, the onus was on the bank to prove that the customer was mistaken or lying; this gave US banks a motive to protect their systems properly. But in Britain, Norway and the Netherlands, the burden of proof lay on the customer: the bank was right unless the customer could prove it wrong. Since this was almost impossible, the banks in these countries became careless. Eventually, epidemics of fraud demolished their complacency. US banks, meanwhile, suffered much less fraud; although they actually spent less money on security than their European counterparts, they spent it more effectively.
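Anderson’s liability point can be made concrete with a toy expected-cost model (all numbers hypothetical, chosen only to show the shape of the incentive): a bank picks a security budget to minimise its own costs, but it only “sees” the fraud it is liable for.

```python
# Toy model of the liability argument: fraud losses fall as security
# spending rises, but a bank minimises spend + (its share of fraud).
# All parameters are hypothetical illustration, not real data.

def expected_cost(spend, liable_share, base_fraud=1_000_000, reduction=0.5):
    """Bank's total cost: security spend plus the fraud it must absorb.
    Each extra 100k of spend halves remaining fraud (toy assumption)."""
    fraud = base_fraud * (reduction ** (spend / 100_000))
    return spend + liable_share * fraud

budgets = range(0, 1_000_001, 50_000)

# US-style rules: the bank bears disputed losses (liable_share = 1.0).
us = min(budgets, key=lambda s: expected_cost(s, 1.0))
# UK-style rules: the customer bears most losses (liable_share = 0.1).
uk = min(budgets, key=lambda s: expected_cost(s, 0.1))

print(us, uk)  # the liable bank rationally spends far more on security
```

The model reproduces the survey’s finding in miniature: shifting liability onto customers makes the privately optimal security spend collapse toward zero, regardless of the total fraud in the system.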
Information war combatants have certainly pursued regime change: there is reasonable suspicion that they succeeded in a few cases (Brexit) and clear indications of it in others (Duterte). They’ve targeted corporations and industries. And they’ve certainly gone after mores: social media became the main battleground for the culture wars years ago, and we now describe the unbridgeable gap between two polarized Americas using technological terms like filter bubble.
But ultimately the information war is about territory — just not the geographic kind.
In a warm information war, the human mind is the territory. If you aren’t a combatant, you are the territory. And once a combatant wins over a sufficient number of minds, they have the power to influence culture and society, policy and politics. […] The key problem is this: platforms aren’t incentivized to engage in the profoundly complex arms race against the worst actors when they can simply point to transparency reports showing that they caught a fair number of the mediocre actors.
Cryptography for normal people
Slamming PGP, and the cypherpunk model of human behaviour it presumes, is a cottage industry. Indeed, in practice, encrypting is hard. But look also at the systems it is embedded in.
GPG and HTTPS (X.509) are broken in usability terms because the conceptual model of trust embedded in each network does not correspond to how people actually experience the world. As a result, there is a constant grind between people and these systems, mainly showing up as a series of user interface disasters. The GPG web of trust does not work, and the weird social constructs invented to prop it up, like key-signing parties, are a sign of that: stand in a line and show fifty strangers your government ID to prove you exist? Really? Likewise, anybody who has tried to buy an X.509 certificate (an HTTPS cert) knows the process is absurd: anybody sufficiently determined can probably figure out how to fake your details if they get there before you do, and of the 1500 or so Certificate Authorities issuing trust credentials at least one is weak or compromised by a State, yet all your browser will tell you is “yes, I trust this credential absolutely.” You just don’t get any say in the matter at all.
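The all-or-nothing complaint can be sketched in a few lines. This is a toy model, not real X.509 path validation: the point is only that the browser’s decision is binary, any one of ~1500 roots can vouch for any site, and the user never gets a vote.

```python
# Toy model (NOT real X.509 parsing) of binary browser trust:
# a certificate is accepted if ANY root CA in the store vouches for
# it, so a single weak or coerced CA undermines every site.

trusted_roots = {f"CA-{i}" for i in range(1500)}  # ~1500 roots, as above
compromised_ca = "CA-1337"                        # hypothetical bad root
assert compromised_ca in trusted_roots            # ...yet still trusted

def browser_accepts(cert_issuer: str) -> bool:
    """The whole decision: issuer in store => 'trusted absolutely'."""
    return cert_issuer in trusted_roots

# A forged certificate signed by the one bad CA is indistinguishable,
# under this check, from a legitimate one:
assert browser_accepts(compromised_ca)   # forged cert sails through
assert browser_accepts("CA-42")          # so does a legitimate one
```

Nothing in the check lets the user say “trust CA-42 for my bank but not CA-1337,” which is exactly the missing say-in-the-matter the paragraph describes.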
The best explanation of this in more detail is the Ode to the Granovetter Diagram which shows how this different trust model maps cleanly to the networks of human communication found by Mark Granovetter in his sociological research. We’re talking about building trust systems which correspond to actual trust systems as they are found in the real world, not the broken military abstractions of X509 or the flawed cryptoanarchy of GPG.
When someone says “assume that a public key cryptosystem exists,” this is roughly equivalent to saying “assume that you could clone dinosaurs, and that you could fill a park with these dinosaurs, and that you could get a ticket to this ‘Jurassic Park,’ and that you could stroll throughout this park without getting eaten, clawed, or otherwise quantum entangled with a macroscopic dinosaur particle.”
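For concreteness, here is the most naive form of the object being assumed: textbook RSA with toy primes. This is illustration only; real deployments need large primes, padding schemes, and constant-time implementations, which is where the dinosaurs live.

```python
# Textbook RSA with absurdly small primes, purely to show the shape
# of "assume a public key cryptosystem exists". Not secure.

p, q = 61, 53
n = p * q                    # public modulus (3233)
phi = (p - 1) * (q - 1)      # Euler's totient of n
e = 17                       # public exponent, coprime with phi
d = pow(e, -1, phi)          # private exponent (Python 3.8+ modular inverse)

def encrypt(m: int) -> int:
    return pow(m, e, n)      # anyone with (n, e) can encrypt

def decrypt(c: int) -> int:
    return pow(c, d, n)      # only the holder of d can decrypt

m = 42
assert decrypt(encrypt(m)) == m   # round-trips for any m < n
```

The four lines of arithmetic are the easy part; everything the quote lampoons (key distribution, revocation, side channels, usable interfaces) sits on top of them.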
Anderson, Ross J. 2001. “Why Information Security Is Hard – an Economic Perspective.” In Proceedings of the 17th Annual Computer Security Applications Conference (ACSAC). https://www.cl.cam.ac.uk/~rja14/Papers/econ.pdf.
Sarigol, Emre, David Garcia, and Frank Schweitzer. 2014. “Online Privacy as a Collective Phenomenon.” In Proceedings of the Second ACM Conference on Online Social Networks, 95–106. COSN ’14. New York, NY, USA: ACM. https://doi.org/10.1145/2660460.2660470.
Valentino-DeVries, Jennifer, Natasha Singer, Michael H. Keller, and Aaron Krolik. 2018. “Your Apps Know Where You Were Last Night, and They’re Not Keeping It Secret.” The New York Times: Business, December 10, 2018. https://www.nytimes.com/interactive/2018/12/10/business/location-data-privacy-apps.html.
Whitten, Alma, and J. D. Tygar. 1999. “Why Johnny Can’t Encrypt: A Usability Evaluation of PGP 5.0.” In Proceedings of the 8th USENIX Security Symposium. https://www.usenix.org/legacy/events/sec99/full_papers/whitten/whitten.ps.