
"The intelligence coup of the century"

For decades, the CIA read the encrypted communications of allies and adversaries by exploiting backdoors in encryption machines.

washingtonpost.com/graphics/20

Free and open-source silicon will allow everybody to audit the full chip design, from netlists down to layout.

Keep up with Free silicon!

@fsi As much as I'd like to believe open-source silicon might help with that ...

... the story we're looking at regards *mechanical* encryption, with engineers who worked on that directly failing *over decades* to detect engineered-in weaknesses. (With a few notable exceptions.)

The complexity of silicon + software is ... greater. As is the likelihood of concealing weaknesses and backdoors.

@dredmorbius

Quoting the Washington Post:

* "Crypto’s shift to **electronic** products [..]. Foreign governments clamored for systems that seemed clearly superior to the old clunky mechanical devices but in fact were **easier** for U.S. spies to read."

* "a **circuit-based** system could be made to appear that it was producing endless streams of randomly generated characters, while in reality it would repeat itself at short enough intervals for experts [..] to **crack** the pattern."
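The quoted weakness — output that looks random but cycles at a short interval — can be sketched with a toy generator. The parameters below are purely illustrative (a linear congruential generator with a tiny state), not Crypto AG's actual design:

```python
# Toy illustration (hypothetical parameters, NOT the actual Crypto AG
# design): a keystream generator whose output looks varied, yet whose
# small internal state forces it to repeat at a short interval.

def keystream(seed, a=75, c=74, m=2**16):
    """Linear congruential generator with a deliberately small state."""
    state = seed % m
    while True:
        state = (a * state + c) % m
        yield state % 26  # emit a "cipher letter" 0..25

def find_period(seed, a=75, c=74, m=2**16):
    """Length of the cycle the generator falls into from `seed`."""
    seen = {}
    state, step = seed % m, 0
    while state not in seen:
        seen[state] = step
        state = (a * state + c) % m
        step += 1
    return step - seen[state]

# The first few letters look random enough to a casual observer...
gen = keystream(seed=42)
sample = [next(gen) for _ in range(12)]

# ...but the whole stream repeats within at most 2**16 steps --
# exactly the kind of pattern an analyst with intercepts can crack.
assert 0 < find_period(seed=42) <= 2**16
```

Since the state space is only 2**16 values, the period can never exceed it; a real cipher avoids this by making the state (and hence the cycle) astronomically large.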

@fsi I believe that's demonstrating my point.

Allowing that Crypto's (closed-source) software was more capable of serving NSA intercept interests doesn't refute the fact that the mechanical system had been doing just that.

Free Software experiences with SSH (Debian time-of-day seed, 86,400 values), OpenSSL, and others show that even unintentionally introduced weaknesses can persist for years.
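The failure mode alluded to here comes down to a seed space small enough to enumerate. A minimal sketch, using the 86,400-value figure from the toot and a hypothetical `derive_key` stand-in (illustrative only, not the real bug's parameters or any library's actual algorithm):

```python
import hashlib

# Sketch of why a tiny seed space is fatal: if a key is derived
# deterministically from one of only 86,400 possible seeds, an attacker
# can brute-force (or precompute) every candidate.

SEED_SPACE = 86_400  # e.g. one possible seed per second of the day

def derive_key(seed: int) -> str:
    """Hypothetical stand-in: the key is fully determined by the seed."""
    return hashlib.sha256(f"key:{seed}".encode()).hexdigest()

def recover_seed(observed_key: str):
    """Try every possible seed until the derived key matches."""
    for seed in range(SEED_SPACE):
        if derive_key(seed) == observed_key:
            return seed
    return None

# A victim seeds from a weak source (say, seconds since midnight)...
victim_key = derive_key(12_345)
# ...and the attacker recovers the seed, hence the key, in moments.
assert recover_seed(victim_key) == 12_345
```

With only ~10^5 candidates the entire search finishes in well under a second on commodity hardware, which is why such weaknesses are catastrophic no matter how strong the surrounding cipher is.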

Long-standing suspicions of NSA "blessed" crypto seed values likewise.

Or Intel's HW RNG.

@dredmorbius

Are you basically saying "we were terrible for years therefore we will be terrible forever and there is no point in trying"?

Secure and stable systems are achievable in software (mechanical systems don't even compare, as they rely on security through obscurity rather than cryptography), and open hardware is a necessary prerequisite. Currently the entire industry is crippled by proprietary software norms, and that affects even the FOSS projects, as they only have to measure up to the standards set by those norms.

When we have international standards of quality and a number of companies competing to produce objectively the best implementations/builds of completely FOSS cryptographic software/hardware, with mandatory warranties and guarantees (just like in any normal engineering industry), and they continue to miserably fail for several decades, then and only then might it be reasonable to doubt the feasibility of such systems.

@fsi

@namark Is that the most charitable interpretation of my argument you can conceive?

@fsi

@namark How about:

- Cryptography and surveillance avoidance are hard.
- Past proclamations that FS/OS systems will inevitably result in greater freedoms and less surveillance have ... proved premature.
- Security and freedom aren't products, they're processes.
- Complexity increases capabilities but reduces reliability.

Open Silicon may be a Good Thing. It may not be.

The assertion that it will necessarily be beneficial seems naive.

The past 60 years of infotech surveillance judge us.

@fsi

@namark Mike Godwin is writing right now about the questions of surveillance and free speech (these are tightly coupled concepts):
slate.com/technology/2020/02/t

Paul Baran, co-inventor of packet-based networks, wrote of the risks of comprehensive surveillance and monitoring, in the mid-1960s.

(His works are freely online at RAND at my request several years ago: rand.org/pubs/authors/b/baran_)

Herbert Simon referenced the Holocaust in dismissing fears. IBM proved him wrong: mastodon.cloud/@dredmorbius/10

@fsi

@namark And of course, there's Shoshana Zuboff who's noted the apparently inextricable link between information technology and surveillance and control: #surveillanceCapitalism

A book reviewed, as it happens, in the #WholeEarthCatalog "Signal" edition, edited by one of those early cyberpunk prophets, #KevinKelly

streettech.com/bcp/BCPgraf/Str

@fsi

@dredmorbius

I have no doubt of your good intentions, and if my argument seemed like a personal attack, I apologize. I am not talking about privacy issues in general, but specifically technical implementation of cryptographic systems, my main argument being that your pessimism about their feasibility is unfounded.

Now to continue being the abrasive pedant that I am:

- Cryptography and surveillance avoidance are hard.

Cryptography and surveillance avoidance are two different things, and the OP was about a critical cryptographic system failure. They didn't manage to somehow steal some keys or break into a house with some state-of-the-art metaphysical mumbo jumbo (I'm looking at you, quantum computers!); they just had a master key to every house and people didn't even have a clue.

- Past proclamations that FS/OS systems will inevitably result in greater freedoms and less surveillance have ... proved premature.

Can you give me an example of free and open source software running on free and open source hardware today? Fully free and open source systems have never been deployed on anything but a passionate-hobbyist scale, and my whole argument was that in an industry crippled by proprietary software norms, even the few existing FOSS projects are measured against those same norms, so their success or failure doesn't prove anything. But I guess you just TL;DR'd, as you repeat this argument verbatim.

- Security and freedom aren't products, they're processes.

The only thing that is a process there is the human factor, and again, the OP was not about the human factor; it was a most embarrassing system failure. The systems themselves, as a whole, including the hardware, are most definitely products.

- Complexity increases capabilities but reduces reliability.

That's what a caveman would say about building a skyscraper (or even a humble 5-story building) while watching a bunch of others miserably failing to put a tent up. A very compelling argument in that setting, except that it is false.

@fsi

@namark @dredmorbius @fsi Excuse me for jumping into the conversation with a nitpick, but

> Complexity increases capabilities but reduces reliability.

is not entirely correct — more capabilities usually do cost complexity (and thus reduced reliability), but not vice versa — increased complexity does not necessarily mean more capabilities.
Metaphorically speaking, complexity is just a measure of how much energy an engine consumes, not of how it performs.

@amiloradovsky Fair point. I was short-cutting the longer argument:

Capabilities of sufficient richness have a minimum complexity requirement. More complexity is a necessary but not _sufficient_ condition for such capabilities -- it's possible to have complexity _without_ delivering a specific property. It's _not_ possible to have that property without the requisite complexity.

Complexity -- more parts, interactions, minimum tolerances -- increases odds of failure.

@namark @fsi

@dredmorbius @namark @fsi Yep, I can only add that the notion of complexity itself is very elusive… Sure, intuitively it's clear that a non-existent system can't perform its function. OTOH, proving a lower bound on complexity is virtually impossible. And what exactly we mean by complexity as a quantity is a separate and deep question: the asymptotic complexity of an algorithm or digital circuit, #Kolmogorov's complexity of a string, something else?

@amiloradovsky Carl Zimmer of the NYT posted a story some years back on a computer simulation of problem-solving behaviours where an "organism" (simulated) had to "evolve" a set of capabilities to solve some problem (maze traversal IIRC).

There was no complexity cost constraint (so highly complex organisms evolved), but success required some *minimum* set of abilities, which showed up in the experiment. Statistically demonstrable if not logically provable.

@namark @fsi

@namark Not having the benefit of your toot length, I'm going to reply to four specific points:

1. Cryptography and surveillance avoidance are two different things....

I'm not claiming they aren't. I'm claiming they're _hard_. You're having an argument I'm not making.

2. ...give me an example of free and open source software running on free and open source hardware...

Again, not my argument. Present surveillance (capitalist/state/other) systems run on free software.

@fsi

@namark Free software hasn't prevented the most capable, pervasive surveillance regime ever devised from being created. It has fully facilitated it.

Open hardware will all but certainly do likewise, if it has any impact.

3. ...The only thing that is a process...

You're either failing to grasp my meaning or ignoring it and arguing something entirely different.

Security and crypto aren't objects or products or states; they're _emergent properties_ and _capabilities_ of complex systems.

@fsi

@namark 4. ...That's what a caveman would say...

That's not even an argument, it's an ad hominem dismissal.

I refer you to Joseph Tainter's work and definition of complexity.

@amiloradovsky 's point that complexity does not *necessarily* increase capabilities is well taken. There are some functions which do however have a minimum complexity floor.

@fsi

@dredmorbius

1. Sure, everything is hard. My point was that cryptography is hard in the same way as building a 5-story stone building is hard, while surveillance avoidance is a "solve human beings" kind of philosophical problem.

2. The question was mostly to demonstrate that you would have a hard time coming up with such an example, because free software/hardware was never adopted in any significant way, so your argument that it was and it didn't work is not true. Free software being abused by existing/new monopolies to further their goals is not free software facilitating it. What happened would have happened regardless, and what little freedom and privacy we have today in software is thanks to free software. If free software ideals were truly adopted and not circumvented in every way possible, the problem might have been more or less solved by now. I don't see how this conflicts with any governance system. Not being able to blindly follow the exact same practices as before in a completely new industry is not exactly an unexpected and unresolvable conflict. Why can't a surveillance (capitalist/state/other???) system run on free software/hardware? My whole point was that it can and it should, and if everyone's does, absolutely embarrassing situations like the one in the OP can be avoided.

3. Well, we might be arguing different things, but from my point of view it's you who is bringing philosophy and politics into a purely technical and much more severe issue. In the context of the issue the OP presented, there were no vague _emergent properties_ or _capabilities_ or anything else as complex as you are trying to imply. It was a simple failure of a very real and tangible system: it's literally the locks to your house being master-keyed without your knowledge, not because it was hard to figure out, but because you were not allowed to look inside or have a local locksmith look inside, and that was (still is) considered normal. And because it is considered normal, there actually are no local locksmiths who would evaluate even "free and open" locks, because it's not a profession one could make a living from in a world where people are not allowed to look inside most locks.

4. It was not an ad hominem, it was an exaggerated analogy. I threw in the caveman for some extra spice; it was not the central point of the argument. A modern 5-story building is much more complex than a tent or a cave. It is BOTH much more capable and much more reliable, even if you include human factors... not philosophy, though; I guess a philosopher would argue that a building is a "simplification" of a mountain with caves, and a tent is way more "complex" (or "reliable", take your pick of nonsense) than either of those.

@amiloradovsky @fsi
