Stack Exchange is a site, or a collection of sites, where people post questions on various subjects, other people post answers and yet others vote on whether they like both the questions and the answers. High-voted posts “float to the top of the heap”. Here is a post from the “Information Security” stack exchange that recently “trended” as one of the most popular questions overall: How to explain that “Cryptography is good” to non-techie friends. And here are extracts from the two topmost answers in terms of votes:
“If lack of encryption allows FBI to catch terrorists, then lack of encryption allows criminals to loot your emails and plunder your bank account.”
The rational point here is that technology is morally neutral. Encryption does not work differently depending on whether the attacker is morally right and the defender morally wrong, or vice versa.
and
I would take their argument and replace “cryptography” with “locks and keys on our houses” and see if they still agree:
If more terrorists and criminals would be caught by not having locks and keys on our houses, I would not blame warrantless searches by government and companies in our homes.
I know little of cryptography, but those arguments seem good to me.
The late, great Auberon Waugh once retorted to an argument that doing X (which increased liberty) would make things easier for criminals by saying that one might as well argue against a Zebra Crossing as it would make it easier for criminals to cross the road.
If the only ‘evidence’ that you have that a crime has been committed is evidence that is encrypted, has there really been a crime at all?
The problem, Mr Ed, is that there is often much evidence a crime has been committed, but the only evidence bearing on the perpetrator’s identity is encrypted. We are all criminals – there are far too many laws for us to even know, let alone obey. Governments, with street cameras, stingrays, license-plate readers, and a thousand other methods know plenty about us, and our crimes. All they need do is decide we need to be gotten, search the databases, and a multitude of broken laws will appear. (Check out Harvey Silverglate, Three Felonies a Day.)
Mission creep happens. Give the FBI, CIA, and LSMFT the right and the tools to decrypt felons’ data. Next thing you know, they’ll be looking for keywords like “Tea Party” or “patriot” in our e-mail. And then our taxes will get complicated. That’s just the entering wedge. If they want to decrypt me, let them sweat over it.
LSMFT
Wow. Talk about dating yourself. I assume we ARE talking about the sooper-sekret lab LSMFT that appeared in Mad Magazine’s spoof of the movie “Fantastic Voyage”?
What was it? “Lab Section for Making Folks Tiny”.
(a repurposing of the Lucky Strike cigarette slogan “Lucky Strike: Mighty Fine Taste”)
It was “Lucky Strike Means Fine Tobacco”.
“I would take their argument and replace “cryptography” with “locks and keys on our houses” and see if they still agree:
If more terrorists and criminals would be caught by not having locks and keys on our houses, I would not blame warrantless searches by government and companies in our homes.”
The analogy works against the argument for crack-proof encryption: while it’s true that locks on doors keep out the criminal, they do not keep out the government if it has a warrant. This was essentially the FBI’s claim against Apple – not that the phone have no encryption (that is, “no lock on the door”), but that a “key” to that door be made available to the government when a warrant is produced. The government’s argument matches your analogy.
That said, I don’t trust any government with that “key”.
I would have thought the best analogy would be with guns, i.e. if you ban/restrict cryptography then only the criminals will have it.
Balance of power between state and people is one important aspect of cryptography. Back in 2001 when 9/11 was the excuse for freezing the number of bits to be used, it was state of the art that was frozen. If George W Bush had told the NSA to monitor a domestic political opponent, that would have been a noticeable reallocation of resources from chasing terrorists, likely to become widely known within the organisation. 15 years later, Obama can do this with far less reallocation of resources, and thus less risk of whistleblowers.
(Compare for example, the effects of a second amendment that said, “The right of the people to keep and bear flintlock pistols and muskets, and Kentucky rifles, shall not be infringed” in 1816 versus 2016. Software technology evolves faster.)
As Runcie notes above, there is also the perennial failure of the hope that “something can be done to stop people inventing things” (the quote is from an initiator of the first Hague disarmament conference in 1907). More cryptography will be invented and those who break laws will get their hands on it much more easily than on forbidden firearms – consider the modern difficulties of enforcing copyright against pirates of songs, etc.
Finally, insofar as the nominal reason for government hostility to encryption is not a fraud, it is, on the part of some at least, too obviously a massive displacement activity. No doubt ISIS is on average well behind us when it comes to software skills, but they seem able to use the web for their purposes. My poem made no suggestion that the gossiping migrants are using chat rooms rather than their own grapevine, and if the German government has lost track of 130,000 of them, just how many (how few!) will it find again through breaking anyone’s encryption? Instapundit says he’ll believe in global warming when those who talk about it act like they do. I’ll believe Merkel, Obama et al restrict encryption to fight terrorists when their overall policies suggest more desire to fight terrorists than to enable them. (Of course, I do not doubt there are those in the NSA and elsewhere who do decrypt terrorists for the stated reasons.) If the full resources of the state let them track at most 100 terrorists, because the latter were never far behind the state of the art, then at least that might tell them what numbers of immigrants from terrorist-recruitment countries would overwhelm them, and whether it was desirable for sponsor states to have atom bombs in their arsenals. And then again, perhaps Merkel and Obama would never learn that from any such real-world limitation.
(BTW, having researched the science, I’ll not fall for global warming even if its aficionados start acting sincerely, though I might less despise the scientifically educated ones among them. At least the terrorists really exist and are really dangerous and can sometimes really be found by decryption.)
Yes, those are good arguments.
Stack Exchange is quite a phenomenon, too. In programming it has largely removed the need to retain detailed knowledge about trivial specifics, leaving only the need for the real skills: exercising judgement about what techniques to apply to which situations. I can only assume it does (or can do) the same for other fields of knowledge. Including, perhaps, political debating.
Josh B: the government does not have a key to my house. If they produce a warrant, I must let them in – but it’s still me, with my key, that lets them in. This is exactly the same as “crack-proof” encryption: if the government has a warrant, they can ask me to decrypt my data for them.
If I refuse to unlock my front door, then it is up to the government to break it down. If I refuse to decrypt my data, it is up to the government to crack the encryption. There’s no difference between them being unable to crack my encryption and them being unable to break down my front door: surely no-one is suggesting that it should be illegal for me to build a really strong front door!
The suggestion that all encryption should have a “back door” for government use is like saying that the government should have a literal back door into everyone’s home. We must trust them never to lose the key, or accidentally allow a criminal to copy the key, which is ludicrous: of course the key will fall into the wrong hands eventually. Then what? All our homes are at risk.
(Note that I did not claim, as some might, that the government’s hands are already the “wrong hands”.)
It’s no good, Runcie Balspune. The kind of people you have to debate about this think only criminals and police should have guns.
Josh, that was not what the government demanded from Apple: in that case the company acted as the manufacturer of both the safe and the lock installed on it – what the government demanded was that Apple give it the master key to all the locks and safes the company makes, everywhere.
To your general point, it is legitimate for the government to demand entry onto private property, if it can show a reasonable indication of a crime having been committed – that’s why it has to obtain a search warrant from a judge. The system is not perfect, but works reasonably well most of the time when the government sticks to the rules (which, as I understand, was not the case with Apple). But the system not being perfect is not an argument against locking your property anyway, or having the legal right to do so.
I also thought of guns, but the equivalence is only partial, since locks don’t have lethal power. It would be applicable if, say, one used encryption which upon an attempt to break it would somehow “kill” the computer from which the break-in was attempted.
Rob Fisher, I’ve occasionally visited politics stack exchange. Though it might one day become a repository for evidence-based judgments on political questions it is a long way short of that now. The questions are a strange mixture of good but slightly obscure research questions that there are not enough forum members qualified to answer, and thinly disguised advocacy.
I think you are all fooling yourselves if you think that the NSA does not already have the capability of breaking key-pair encryption.
That whole fuss over breaking into the iPhone was smoke to cover the fact that they already knew how to do it.
The reason that the government does not make much more fuss about encryption than they currently do is because they want people (especially people they don’t like) to feel secure using that encryption.
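Whatever the truth of that suspicion, it helps to be concrete about what “key-pair encryption” means. In RSA-style systems the public modulus is the product of two secret primes, and recovering the private key reduces to factoring that modulus. A minimal sketch with deliberately tiny, illustrative primes (real keys use 2048-bit moduli and larger):

```python
# Toy RSA with deliberately tiny primes, to illustrate that breaking
# RSA-style key-pair encryption reduces to factoring the public modulus.
# Real deployments use 2048-bit moduli, far beyond trial division.

p, q = 61, 53              # secret primes (toy-sized)
n = p * q                  # public modulus: 3233
e = 17                     # public exponent
phi = (p - 1) * (q - 1)    # 3120
d = pow(e, -1, phi)        # private exponent via modular inverse (Python 3.8+)

msg = 42
cipher = pow(msg, e, n)          # encrypt with the public key
assert pow(cipher, d, n) == msg  # decrypt with the private key

# An attacker who can factor n recovers the private key outright:
def factor(n):
    """Trial division: instant for toy n, hopeless for real key sizes."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return None

fp, fq = factor(n)
d_recovered = pow(e, -1, (fp - 1) * (fq - 1))
assert pow(cipher, d_recovered, n) == msg  # full break of the toy key
```

Trial division exhausts a toy modulus instantly; against a 617-digit (2048-bit) modulus the same search, and every publicly known classical factoring method, is hopeless – which is why any claimed break would have to come from new mathematics or implementation flaws, not raw computing power.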
The Apple case was not about decryption; it was about circumventing the auto-destruct triggered by failed passcode attempts. There were other aspects to the government agency’s demand to Apple: it also wanted the delay between failed passcode attempts removed, and passcode entry enabled via the device’s ports. This seems to indicate that the agency was simply planning good old-fashioned brute-force passcode guessing, or something similar.
In cases where time is not an issue (such as recovery of information after the event), there really is no form of encryption that can avoid continuous brute-force methods; all it needs is available processing power, and it is reasonable to assume that the government has access to the most processing power in these situations. So the question of whether there is an encryption back door or not is moot.
As mentioned before, the real question is: assuming the government can crack our encryption no matter what, what are they entitled to use that capability for?
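To put rough numbers on the brute-force point: feasibility is purely a matter of keyspace size versus guess rate, which is why the Apple dispute centred on passcode-guessing safeguards rather than on the cipher itself. A back-of-envelope sketch (the guess rates here are illustrative assumptions, not measured figures):

```python
# Back-of-envelope brute-force feasibility: keyspace size vs. guess rate.
# The rates below are illustrative assumptions, not measured figures.

SECONDS_PER_YEAR = 31_557_600

def years_to_search(keyspace, guesses_per_second):
    """Worst-case time to exhaust a keyspace, in years."""
    return keyspace / guesses_per_second / SECONDS_PER_YEAR

# A 4-digit passcode tried on-device at ~12 guesses/second (no delays,
# no auto-erase) falls in minutes:
pin_space = 10 ** 4
print(pin_space / 12 / 60, "minutes")          # roughly 14 minutes

# A 128-bit key against a hypothetical rig doing 10^12 guesses/second:
key_space = 2 ** 128
print(years_to_search(key_space, 10 ** 12), "years")  # on the order of 10^19

# Hence the demands in the Apple case: remove the safeguards around a tiny
# passcode keyspace, rather than attack the encryption key itself.
```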
For a while, one of the pet questions of the illiberal left was, “What freedoms are you willing to surrender for the sake of being safer?” The proper response, of course, is that exercising one’s “negative” rights to the maximum degree makes no one less safe – not you, nor anyone else. What REALLY bears examination is the nature of their zero-sum presumption: that liberty is inherently unsafe and, presumably by the same argument, that lack of liberty begets safety.
This is the mindset of statists, authoritarians, pets and slaves, i.e. those who wish for a lord and master.
The Feds v Apple was about an exercise of power (force), not ability. They could (and did) hire someone to crack anything they want cracked, anytime they felt like it. No different from “the courts” trying to force someone to open a safe before they resort to bringing in their own safe-cracker.
The Feds v Apple was a trial balloon.
I quote (cut and paste) from Wiki…
“In the United Kingdom, the Regulation of Investigatory Powers Act gives UK police the powers to force suspects to decrypt files or hand over passwords that protect encryption keys. Failure to comply is an offense in its own right, punishable on conviction by a two-year jail sentence or up to five years in cases involving national security.[7] Successful prosecutions have occurred under the Act; the first, in 2009,[59] resulted in a term of 13 months’ imprisonment.[60] Similar forced disclosure laws in Australia, Finland, France, and India compel individual suspects under investigation to hand over encryption keys or passwords during a criminal investigation.” ~ wiki/cryptography.
Alisa, locks can indeed have “killing power”, which is usually the go-to argument for the feds, i.e. “What if a terrorist had a bomb ‘under’ New York City?” (This chestnut usually places the bomb “under” NYC for some reason.)
This is also their “reason” to legalize government sanctioned torture.
As that is the go-to argument for the feds, it’s much like “think of the children” is the go-to so-called argument of the left, even the illiberal left.
All the above is part of their “rights are dangerous” song and dance.
And you know this how?
For those interested, here is a Numberphile lecture on YT from a Berkeley maths professor, Edward Frenkel, purporting to explain the maths behind the NSA’s email hacking. It takes some time to get to the point, which is elliptic curve cryptography and an indication that the entire cryptographic set-up rested on a compromised algorithm provided by a US government agency – the National Institute of Standards and Technology, part of the Department of Commerce – and that the NSA knew the ‘backdoor’ to the algorithm.
I’ve no idea if this is correct or even accurate, it was from December 2013.
Mr Ed: the possible backdoor in question was described by Bruce Schneier here. It’s definitely very suspicious, but there’s no way to know for sure whether it was deliberate or, if it was, who did it.
More importantly, the weakness in DUAL_EC_DRBG is well known and, as a consequence, it isn’t used much. RSA used it (apparently after receiving money from the NSA) but not many other people did. As discussed here, it’s extremely unlikely that this backdoor was ever actually used.
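The actual Dual_EC_DRBG construction involves elliptic-curve points, but the failure mode – a generator whose future output is predictable to anyone holding special knowledge of its constants – can be shown with a much simpler toy. Here a raw linear congruential generator leaks its entire state in each output, so an observer who knows the (public) constants can predict everything that follows. This is an analogy only, not the Dual_EC mathematics:

```python
# Toy illustration of a predictable generator. A raw LCG emits its whole
# internal state as output, so anyone who knows the constants a, c, m can
# continue the sequence from a single observed value. (Analogy only: the
# suspected Dual_EC_DRBG backdoor is subtler, resting on a secret
# discrete-log relationship between two curve points.)

a, c, m = 1103515245, 12345, 2 ** 31  # public constants (glibc-style values)

def lcg(state):
    """Yield successive LCG states; each output IS the next state."""
    while True:
        state = (a * state + c) % m
        yield state

# The "victim" seeds with a secret value and emits outputs:
victim = lcg(987654321)
observed = next(victim)

# The "attacker" saw only that one output, yet predicts the rest exactly:
attacker = lcg(observed)
assert [next(attacker) for _ in range(5)] == [next(victim) for _ in range(5)]
```

Cryptographic generators are designed so that outputs reveal nothing about internal state; the Dual_EC suspicion was that one party had retained a shortcut around exactly that guarantee.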
Anomenat,
Thanks for that, so this is a well-known issue.
The only human random number generator I know of is a politician explaining his tax and spend policies, but like Enigma, he will not encrypt a number as itself.
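Incidentally, that Enigma property is real and easy to check: the reflector pairs letters up, so no letter can ever encrypt to itself, and the wartime codebreakers exploited exactly this to rule out guessed-plaintext alignments. A small sketch using the historical reflector-B wiring:

```python
# The Enigma reflector is a fixed-point-free involution: it pairs letters
# up, so no letter can ever encrypt to itself. The rotors are bijections,
# which preserves the property end to end. Toy model of the reflector only.

import string

# Historical Enigma I reflector B (UKW-B) wiring:
REFLECTOR_B = "YRUHQSLDPXNGOKMIEBFZCWVJAT"

pairing = dict(zip(string.ascii_uppercase, REFLECTOR_B))

# It is an involution: applying it twice returns the original letter...
assert all(pairing[pairing[ch]] == ch for ch in string.ascii_uppercase)
# ...with no fixed points: no letter maps to itself.
assert all(pairing[ch] != ch for ch in string.ascii_uppercase)

# A guessed crib could therefore be rejected at any alignment where some
# letter would have had to encrypt to itself.
```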
I don’t know it. That is why I prefaced my remarks with “I think”.
But I suspect it, based on the way the U.S. government stoutly prevented the export of strong encryption for years, and then suddenly folded, with a great showy sigh of reluctance; the way they continually make noise about the need to “do something” about encryption, but never actually do anything effective; finally, the fact that the encryption algorithms themselves are convoluted and complex, rather than simple and easily provable.
If “they” (shadowy spooks sitting in nondescript buildings in their plain dark suits) did in fact have key-pair encryption broken, I submit that they would behave precisely as they are behaving now.
“Mission creep” has already happened.
– Glenn “Instapundit” Reynolds
Hmmm… the comments above on my comment are all very good. Although I was not arguing against crack-proof cryptography, only that the analogy was not an apt one, I think, given the responses, I may need to think on this one a bit more.
Comment on the Apple case from the YT Computerphile channel, nothing startling but some context.