
Prior to the Internet, cipher schemes existed that could delay or foil the best government analysts, and that took significant government effort to even attempt to crack. It was never illegal to use such a code and transmit the result by letter, postcard, or phone. And a government warrant could not compel the decipherment of such a scheme, unless someone had possession of a physical key usable for decipherment.

The two differences now: we've made that technology available to non-experts, and we hope that no amount of government effort can crack the scheme (as opposed to obtaining the key).

I agree with your comment that it's not a "fake crisis", though. It's one with only one right answer, but it's a "crisis" in the sense that no possible path forward will make both parties happy, so we fundamentally need the government to either realize they're wrong or to lose. Government positions don't change easily, and governments do not like to lose.

While it bothers me to see terms like "common ground" used, as they imply that both positions have grounds worth considering, I do think one of the few paths that has a hope of success is to convince the government that there exists a position they can adopt that doesn't look like it goes back on their current stance.



It's mostly not about banning encryption, though. The debate mostly centers on regulating the products and services companies can provide to facilitate such communications. If there had been a precursor, companies promising to make it easy for regular joe criminal to encrypt his postal mail, it's conceivable that the government would have cracked down on those companies.

> It's one with only one right answer, but it's a "crisis" in the sense that no possible path forward will make both parties happy, so we fundamentally need the government to either realize they're wrong or to lose. Government positions don't change easily, and governments do not like to lose.

It's not the "government" versus "the people." It's a small group of people who strongly support surveillance, a small group who strongly oppose it, and a mushy middle that tends to lean towards whatever makes them feel safe. People in each group are represented within government, though for obvious reasons people in the first group tend to gravitate toward positions involving national security or defense.


> It's mostly not about banning encryption though. The debate mostly centers on regulating the products and services companies can provide to facilitate such communications.

Banning effective encryption, or banning commercial encryption, is still effectively banning encryption, or forcing it underground and casting suspicion on it.

> It's not the "government" versus "the people." It's a small group of people who strongly support surveillance, a small group who strongly oppose it, and a mushy middle that tends to lean towards whatever makes them feel safe. People in each group are represented within government, though for obvious reasons people in the first group tend to gravitate toward positions involving national security or defense.

The people within government who oppose backdoors have yet to be very vocal, or effective. I'd certainly love to see a large outpouring of support from government, to counter the level of support for the pro-backdoor position.


The former NSA head opposes encryption backdoors: http://www.networkworld.com/article/2990294/security/former-.... So does the current NSA director: http://www.digitaltrends.com/computing/nsa-director-actually.... The White House Office of Science and Technology also opposed the FBI's proposal, which led to the White House tabling the idea: http://www.nytimes.com/2015/10/11/us/politics/obama-wont-see....

Many people in the tech community don't want to get behind those in government who oppose back doors because those officials are not absolutist in their rhetoric about privacy. They want to assert that there is not even a debate to be had between security and privacy, and that privacy should always win. But when the vast majority of the actual voting public is concerned just as much about safety as about privacy, if not more, that's not a tenable position for elected and appointed officials to take.


Framing the issue as security/safety versus privacy would accept the rhetoric that keeping encryption legal and non-broken would reduce security/safety. (Which aligns with the similar rhetoric about pervasive surveillance.)

The comments by the NSA director didn't seem to have any significant effect on the general government message. As for the White House position (which I had not seen the announcement of, so thank you for the link), they said they won't seek legislation, but they hardly need to at this point; those in Congress seem more than happy to keep proposing such legislation, and I've seen no suggestions that the White House would veto it if passed.


> The debate mostly centers on regulating the products and services companies can provide to facilitate such communications.

You're calling them products but the relevant thing they want to regulate is still more speech.

If you want to communicate with your friends in code then you first have to communicate the code itself. In this context the code is code, but code is speech.


Code is not speech any more than electronic circuits are speech. That is to say, in unique contexts where the code is itself a means of expression, code may be speech,[1] but not when it is used to build something that enables communication.

[1] Bricks can also be expression in unique contexts. That doesn't mean that bricks are speech.


A brick or electronic circuit is speech when being used as a medium of expression. Code is speech all the time because it can't be anything else. It's pure information. You can't email someone a brick.

It happens we have machines that will turn that information into action, but the code isn't the machine or the action. It's just a type of speech that machines can understand too.

People are always wanting to regulate speech by combining it with a machine, but the machine and the speech are separate. They don't have any specific relationship. Apple makes a) a general purpose computer and b) computer software. But (modulo DRM/copyright) you could run that software on any general purpose computer and use that general purpose computer to run any software.

It's like trying to regulate what information you can print in a newspaper by claiming you're regulating the printing press.


A piece of encryption software is clearly both.

It is speech. If you wanted to, you could even go find the source code and translate it into English in such a way that a relatively competent programmer could turn it back into code ("If the first bit in the byte is 1 then do .... otherwise do ....").

But it's clearly also a tool. I've never read the source code to the software I use to encrypt my hard drive. It's unlikely that I ever will. I just care that it does the job I want it to.

Trying to say that it's either one or the other is silly. It's both.

But just because it's speech doesn't mean that the government might not have an interest in regulating it. The first amendment is not absolute. I can imagine a great many prima facie arguments supporting the idea of regulating encryption software. The fact that code is speech is not, in and of itself, a defense against any of them.

As with most cases of constitutional law, it comes down to weighing competing interests. Failing to acknowledge these varying interests fails to acknowledge the actual question at hand.


> Trying to say that it's either one or the other is silly. It's both.

I'm not trying to say that it's one or the other. I'm trying to say that there is no part of it that isn't speech. There is not a part which is a tool and a distinct part which is speech. The whole of it is speech. All you're saying is that it's possible to use pure speech as a tool. But what of it?

You can't win by talking about balancing because encryption software is meta. You can use it to distribute it. If people who are breaking no law have the right to be able to communicate without government surveillance then the government would have to violate that right universally to enforce any rule restricting the distribution of software, because distributing software over a secure channel is indistinguishable from any other communication of the same size. It's hard to imagine anything that could justify that level of intrusion, and certainly not anything that has been proposed as a countervailing interest in this context.
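The indistinguishability claim is easy to demonstrate with a toy sketch (this uses a one-time pad for illustration, not any real messaging protocol; the same property holds for any strong cipher): once padded to a fixed length and encrypted, program text and ordinary prose produce ciphertexts that look identical on the wire.

```python
import os

def otp_encrypt(data: bytes) -> bytes:
    # One-time pad: XOR the data with a fresh random key of the same length.
    # The resulting ciphertext is uniformly random regardless of the content.
    key = os.urandom(len(data))
    return bytes(d ^ k for d, k in zip(data, key))

n = 64
code  = b"print('hello, world')".ljust(n)     # program text
prose = b"see you at noon tomorrow".ljust(n)  # ordinary message

c_code, c_prose = otp_encrypt(code), otp_encrypt(prose)

# Both ciphertexts are random bytes of identical length; an observer
# of the channel cannot tell which one carried software.
assert len(c_code) == len(c_prose) == n
```

So a rule against distributing encryption software over encrypted channels could only be enforced by inspecting every encrypted communication, which is exactly the universal intrusion described above.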


Code qua written expression is speech. The government can't ban, for example, the publication of encryption code. But that doesn't mean the government loses the ability to regulate the operation of a product just because the operation is implemented using code rather than electronic circuits.

As to your printing press hypothetical, I don't think it applies. Say the back door is something like "must keep the decryption key around in case a warrant comes in." That doesn't entail any modification to the "speech" coming out of the device, does it? So how does it restrict speech?


> But that doesn't mean the government loses the ability to regulate the operation of a product just because the operation is implemented using code rather than electronic circuits.

Maybe it helps to better define what you mean as the product whose operation is to be regulated. If it's the hardware, it's a general purpose computer that can run any software. We quickly go to a bad place if you can't buy such a thing, e.g. side-loading on Android is prohibited, Raspberry Pi and RISC-V are prohibited, every device must prevent you from compiling a custom program and running it, etc.

But if it's the software then you're banning the publication of encryption code.

Obviously the confusion stems from Apple being the go-to example and Apple not only providing both the hardware and the software but also actually enforcing the kind of restrictions on what software the user can run that would be unreasonable as a requirement imposed by the government on the entire market. Apple could [try to] prevent you from using encryption software on an iPhone, but it makes little sense to require only them to do that if anyone can still run it on an Android phone or PC. But the alternative is that nobody can buy anything capable of running Debian or OpenBSD (or even Windows).

Moreover, the operation of the product is the thing carried out by the owner, not the manufacturer. Apple makes a nice machine and a big detailed list of things you can do with it but the user is the one choosing which buttons to press.

> As to your printing press hypothetical, I don't think it applies. Say the back door is something like "must keep the decryption key around in case a warrant comes in." That doesn't entail any modification to the "speech" coming out of the device, does it? So how does it restrict speech?

So there are two questions here: One is, can I have the code that doesn't keep the encryption key? If not then it restricts the speech you can receive and that of other people who want to give you that code (perhaps so you can use it to communicate more sensitive information with them).

Then there is the key itself. Keeping the key (or sending it to Apple or Uncle Sam) is compelling speech. The key is also information and the key + ciphertext is equivalent to the plaintext. It's equivalent to a requirement that you keep the plaintext of all your communications.

It would effectively be compelled written testimony before the fact. Either you keep everything written down or go to jail for not having it.
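The key/plaintext equivalence can be shown in a few lines (again a toy XOR cipher for illustration; a real escrow proposal would involve AES or similar, but the equivalence is the same): whoever retains the key and can see the ciphertext effectively retains the plaintext.

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR cipher: applying the same key twice returns the original data,
    # so one function serves as both encrypt and decrypt.
    return bytes(d ^ k for d, k in zip(data, key))

plaintext  = b"meet at the usual place"
key        = os.urandom(len(plaintext))  # the key a retention rule would compel you to keep
ciphertext = xor_cipher(plaintext, key)  # the message as transmitted

# Retained key + recorded ciphertext reconstruct the plaintext exactly,
# so keeping the key is keeping the plaintext.
recovered = xor_cipher(ciphertext, key)
assert recovered == plaintext
```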


> It's equivalent to a requirement that you keep the plaintext of all your communications.

> It would effectively be compelled written testimony before the fact. Either you keep everything written down or go to jail for not having it.

While I agree with you, that particular argument won't necessarily succeed, considering https://en.wikipedia.org/wiki/Sarbanes%E2%80%93Oxley_Act and similar laws regarding reporting and information retention policies. You don't want the government equating policies to protect user communication as equivalent to destruction of evidence.


I fail to understand this point. Are you saying that because you have to transfer the tool (encryption software) over a medium, the tool is now speech?

I am pretty...unsympathetic...to this interpretation.


> Are you saying that because you have to transfer the tool (encryption software) over a medium, the tool is now speech?

Not because you have to but because you can. It proves the "tool" is pure information. Speech.


> no possible path forward will make both parties happy

We only have to make one party happy: the 'people', as the government (should) serve them.



