Featured Post

Congress Doesn’t Get Encryption Either

A few weeks back, President Obama made remarks about consumer needs for strong encryption versus law enforcement and national security needs for access to private data. I complained at the time that the President simply didn’t get it — there’s no such thing as making encryption “a little bit” weak. Over the last ten days, it’s been Congress’s turn, and they don’t get it either. On Monday, a House committee held hearings on the problem. The politicians wound up in the same place as the President: the answer is — because the politicians say it must exist — some compromise that gives law enforcement access but is so strong that the other bad guys are kept out. They’re just as wrong as the President was about the existence of such a compromise.

The Senate found a new way to be wrong. Last week, Senators Richard Burr (R-NC) and Dianne Feinstein (D-CA) released the proposed text of their update to the All Writs Act. The AWA dates to 1789 and gives the courts authority to order all actions necessary to their function. The AWA has been in the headlines lately because of a federal court order requiring Apple to assist the FBI in decrypting the contents of an iPhone used by one of the San Bernardino shooters. The new mistake is language that requires tech companies to provide court-ordered data “in an intelligible format,” or provide the necessary technical assistance for the government to obtain the data in such a format. What does intelligible mean? The draft spells it out:

(10) INTELLIGIBLE. — The term “intelligible”, with respect to information or data, means — (A) the information or data has never been encrypted, enciphered, encoded, modulated, or obfuscated; or (B) the information or data has been encrypted, enciphered, encoded, modulated, or obfuscated and then decrypted, deciphered, decoded, demodulated, or deobfuscated to its original form.

The authors have tried to cover a lot of bases, and overreached as a result.  There is an implicit requirement that encodings of all sorts be reversible — that the original data can be recovered.  While ciphers fit that requirement by design, it’s not necessarily true for other types of encoding.  Some of them are “lossy” — data is intentionally discarded (lost) and can’t be recovered.


Three views of “Lena”, the most famous image in compression research. Technically still copyrighted by Playboy magazine, but Playboy has said they are not concerned about use of this subset of the original.

Image compression is an easy case. A raw grayscale image typically consists of eight bits for each pixel, representing different shades of gray from black to white.  The top image on the left is such an image. When encoded according to the JPEG standard, the size of the file representing the image is decreased dramatically. More importantly for this discussion, exactly reversing the encoding in order to recover the original data is not technically possible.  The middle image is a reconstruction from the JPEG version that was about one-third the size of the original.  To a human — or software mimicking human image processing, such as facial recognition — the differences are nearly indistinguishable.  JPEG is designed to preserve the important bits, but not the unimportant ones, from a human visual perspective.  But a JPEG-encoded image would certainly seem to be a technical violation of the draft statute: the original form of the data can’t be recovered.

(The bottom image compares the previous two, pixel by pixel.  In the comparison, the lighter the pixel the greater the difference in the corresponding pixels in the two images.  Most of the pixels are wrong, but in ways that don’t interfere with how humans see things.)
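The lossiness is easy to demonstrate without JPEG itself. Below is a minimal Python sketch in which coarse quantization stands in for JPEG’s DCT-domain quantization (an assumption for illustration only; this is not the JPEG algorithm). Many different pixel values collapse to the same encoded level, so no decoder can tell which original was sent.

```python
# Toy lossy encoder: collapse 8-bit grayscale values (0-255) into 16
# levels. This is NOT JPEG; it just exhibits the same property of
# discarding information that cannot be recovered afterward.

def encode(pixel: int) -> int:
    """Quantize an 8-bit value to one of 16 levels."""
    return pixel // 16

def decode(level: int) -> int:
    """Reconstruct an approximation: the midpoint of the level's range."""
    return level * 16 + 8

original = [0, 37, 128, 200, 255]
reconstructed = [decode(encode(p)) for p in original]
# Close, but not equal: encode(37) == encode(40), so once encoded,
# "37" and "40" are indistinguishable and the exact original is gone.
```

Exactly as with the Lena images above, the reconstruction is visually close while most individual values are slightly wrong.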

Images aren’t the only place this happens. For a time, “disemvoweling” objectionable comments was a fairly common blogging practice. The blog owner removed all of the vowels from the comment text, a form of obfuscation. “Mike is a silly twit” was replaced by “Mk s  slly twt”. The comment was still presented inline, and the content could usually be reconstructed, but that took some effort and most people would simply skip over it instead. This scheme renders both of the phrases “fiery lava” and “fairy love” as “fry lv”. In context, which expansion is correct may be clear. Sometimes it’s not. The token “lv” is particularly difficult here. Alive, lava, lave, leave, levee, live, love, and olive are all possible. As is the case with image encoding, the text obfuscation is lossy and cannot be reversed by a static algorithm. Human judgment is often required.
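A minimal sketch of the scheme (the function name is my own):

```python
def disemvowel(text: str) -> str:
    """Drop vowels, keeping consonants, spaces, and punctuation."""
    return "".join(c for c in text if c.lower() not in "aeiou")

# Distinct inputs collapse to the same output, so no static algorithm
# can reverse the obfuscation:
assert disemvowel("fiery lava") == "fry lv"
assert disemvowel("fairy love") == "fry lv"
```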

Most frustrating about all of this is the implicit assumption that the bad guys are uniformly stupid. There’s an assumption that a terrorist leader, for example, will communicate in clear text using a cipher. Ciphers are only one method of passing secret information. A codebook, where a phrase is assigned a specific arbitrary meaning, is a different method. Phrases from a codebook are much harder to decode. Suppose that the authorities determine, from context, that the text “fry lv” should be taken as “fiery lava”. What does that phrase correspond to in the codebook? Execute attack plan “L” per previously agreed-upon schedule? Or run like hell, the FBI is on to us?
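A codebook is nothing more than an arbitrary lookup table; the entries below are invented for illustration, but arbitrariness is the point. There is nothing for a computer to compute: without a copy of the book, the mapping is unrecoverable.

```python
# Hypothetical codebook; the assignments are arbitrary by design.
codebook = {
    "fiery lava": "execute attack plan L on the agreed schedule",
    "fairy love": "run like hell, the FBI is on to us",
}

def decode_phrase(phrase: str) -> str:
    # No cipher to break: either you hold the book or you don't.
    return codebook.get(phrase, "<not in the book>")
```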

Expressed another way, citizens are being asked to surrender the benefits of strong encryption on the assumption that the bad guys are amateurs, not pros. And don’t have access to pros. But a different type of bad guy, the ones interested in stealing my data for their own financial gain, already employ pros. Very good pros. Let’s not weaken commercially-available encryption and make it easier for one kind of bad guy just because we’re afraid of a different sort of villain.  And in particular, if we’re going to go ahead and outlaw strong ciphers, let’s write laws that do that narrowly instead of potentially entangling a lot of other data manipulations.

Staff Writer

Michael is a systems analyst, with a taste for obscure applied math. He's interested in energy supplies, the urban/rural divide, regional political differences in the US, and map-like things. Bicycling, and fencing (with swords, that is) act as stress relief.


62 thoughts on “Congress Doesn’t Get Encryption Either”

  1. Somebody please just make a free app for android and apple, host it some place like Pirate Bay, mirror the hell out of it, and make it open source. Screw the gov’t.


      • Yep. Zfone for Windows, Macs, and Linux. WhatsApp, for pretty much any mobile device. The Justice Department is already in court because they have a wiretap order they can’t implement — the party in question is using WhatsApp to implement end-to-end strong encryption, with the keys in the devices and not stored anywhere on the internet.


        • This is the larger point to be made. Congress can hand down whatever edicts it wants, it won’t matter a nit. Let’s work this through:

          1) I can install WhatsApp/etc on my mobile device. Boom, just got past whatever backdoor the fed made Apple or Samsung install.

          2) Fed tells all US companies to provide backdoors. Fine, bad guys get encryption tools from Europe, or Japan, or some talented guys in the Ukraine who run credit card scams on the side.

          3) Feds demand US phones can only install approved software. People get phones from overseas. Feds demand such phones can’t work in the US – right, our trading partners are gonna love that little bit. And this assumes a phone can’t be cracked to allow the installation of whatever the hell the user wants (and they will be).

          It’s a rabbit hole that goes nowhere good. The fact that Senators are even thinking about a bill like this tells me that they are either A) Signalling to voters and not actually serious; or B) Seriously out of their depth intellectually and have no one on their staff who can explain this to them.


          • Too many movies on hacking, I think. Congressmen aren’t technology experts, and they don’t grasp the one fundamental concept of computer security: Any exploit is a total exploit.

            A backdoor works for anyone. A hack works for anyone. Introducing any flaw, backdoor, hole, weakness, exploit, whatever into security is weakening it for everyone — good or bad. There is no such thing as a ‘secure’ backdoor.


              • What makes you think the staff includes encryption and security experts? Strange as it is, it’s a niche field. And the privacy experts (which they are far more likely to have on call) are generally pretty clueless on security.

                And what they’re faced with is the FBI, CIA, NSA, and police all screaming that this needs to be done on one hand, and a handful of software folks and manufacturers on the other side.

                And, to be blunt, any Congressmen with a hint of experience is going to assume both sides are lying at least some because that’s how it always works.

                Security and encryption are complex fields — and I think the very simplicity of some of the conclusions (“Any exploit is total”) works against them. It can’t be THAT simple, the thinking goes; that’s just rhetoric. When is real life that clean? In this case, it is: weakening security or encryption adds a total vulnerability, open to anyone, no matter what happens on TV.

                So you have cynicism saying the testifying experts are…stretching the truth…to benefit their side, warring with the also-truth-stretching law-and-order folks, and it ain’t just courtrooms where the police uniform wins all ties.


                  • But you do software and computers.

                    One of the reasons I was hired by the Colorado Joint Budget Committee staff director was as an experiment. Did it make any difference that someone on the staff was not intimidated by the IT people in the executive departments? And could explain assorted IT concepts to the Committee members? (Spending half a morning explaining “regression testing” to the Democratic Caucus is… well, I’m not sure what it is.)

                    My own opinion is that it was a mixed bag. The JBC was more receptive to budget requests for software systems when someone on their staff could translate between the department making the request, and the budget world. I was a valuable resource for other staff analysts. OTOH, it was a specialty expertise that, given limited staff, he couldn’t guarantee to the Committee he would maintain.

                    In the legislative staffing world, technical SMEs are a luxury that’s uncommon.


          • To the extent practical, I’d recommend Signal over Whatsapp.

            WhatsApp recently adopted the protocol developed by and for Signal, which improved security considerably. But Signal is a security project – its #1 priority, without which it has no raison d’etre, is security. The security features of Whatsapp are way down on the priority list. Cynically, that’s probably doubly true now that Whatsapp is a Facebook-owned enterprise.


  2. This is why Apple was wrong to make the stand they did in this instance.

    The FBI wasn’t asking Apple to decrypt the data on the phone or provide a tool to do so. All they wanted was access to the encrypted data so they could attempt a brute force attack. But they were locked out by a four-digit PIN code — hardly a strong encryption key. The rub was that ten unsuccessful attempts would erase all the data on the phone, so they wanted Apple to apply an update to get around that particular feature.

    But they decided to make their brave stand for the privacy of the data of dead terrorists. This came only a few months after JLaw’s cooter was splashed all over the Internet thanks in part to the ironclad security of iCloud.

    Exactly whose privacy interests did Tim Cook imagine he was protecting anyway? The phone is owned by the County of San Bernardino. They wanted the FBI to have access. This whole thing was arguably about shitty customer service; there certainly weren’t any 4th amendment issues in play.

    But make their brave stand they did. And as a direct and foreseeable result we now have a bunch of technologically ignorant ninnies in Washington proposing ill-conceived limits on consumer grade encryption. Thanks a lot, Tim. Asshole.


    • I disagree. Apple’s position was “we’re not helping you do an end run around our design”. That’s the position to take on every front. (The NSA likely had a backdoor/hack already). This was pure politics from the Administration/FBI. One should not reward stupidity by gov’t employees either.

      All efforts by the gov’t to obtain backdoors, hacks, end runs, etc. need to be resisted.


      • Hypothetical: A large corporate customer — e.g., GE or GM — approaches Apple and says, “We have a problem we’re hoping you can help us with. One of our senior managers unfortunately passed. There’s some critical information on an iPhone that we issued to him and we need access but no one knows the PIN code. Can you help us get into this? We’re certainly willing to pay you for your efforts.”

        What do you think their response would be? How was this different, at least initially? Remember, the court didn’t get involved until Apple told them to pound sand.


        • If your relative passes away, you may need a locksmith service to access property (real or tangible).

          If the police ask for locksmith services to access property, that’s a different scenario with different rules. (and appropriately so).


          • Of course. But that’s (a big part of) my point. My understanding is that the phone is the actual property of the county government so it seems analogous to the first case to me. If I’m mistaken about the ownership issue then that would change the calculus.


          • “If your relative passes away, you may need a locksmith service to access property (real or tangible).”

            Presumably that property will not automatically set itself on fire if someone jiggles the lock more than four times.


        • We already know Apple’s response in such cases: “You’re big boys. We provide all of the tools for you to manage company-issued iDevices so that this can’t happen. It’s not our problem that you were too lazy/stupid/cheap to manage your assets.”


            • I’m not really talking about “responsibility” or “obligation” here. Rather, what realistically would you imagine the response of Apple be to such a request from a large customer? I mean, I realize that Apple and others seem to believe that iPhone is the very definition of mobile telephony but, in fact, Samsung exists and makes some damn fine competing products.


              • There is no way a large company, like GE or GM, doesn’t have AirWatch or some other MDM software on the phone to pull the data themselves. If they’re just handing phones out right out of the box, Apple would likely recommend that in the future they install MDM software to be able to pull data off. They might even recommend their enterprise partner, IBM, to set it up for them.


              • I would predict that

                1) if Apple did that for one corporate customer and word got out to others, their reputation for security would suffer considerably, and they would lose more business than they would gain.

                2) Apple knows that perfectly well.


          • I’m curious how you would or even could “know” this. Not saying you’re wrong, just saying I’ve seen nothing to confirm this. A big customer, a quiet request… seriously?


            • The fact that they didn’t have the software available to do this and requested information on how they cracked it would be clues that this is what they would say. They opened up the iCloud backups for the FBI because they had the tools available to decrypt them. They did not open up the device.


            • I think you may be underestimating the willingness of big shops like Apple to stick to their guns on matters that are important to them – like an absolute refusal to knowingly build backdoors for their products.

              As mentioned here and there around the web, ordering such a thing to be built would probably cost them a lot of their best engineers – the really good security folks know they can walk away and find a job elsewhere, where they won’t be asked to weaken what they’ve devoted their careers and professional reputations to strengthening.

              Also, big customers are probably going to respect that, because many of them can see the big picture. If nothing else: if Apple did this for us, they are willing to do it for our competitor. And if they will do it for our competitor on a phone they legitimately own, they can be tricked into doing it on a phone they stole from one of our engineers. And if Apple doesn’t understand all that, they’re not competent to make the kinds of security claims they’re making.


    • A quibble: the phone is not “owned” by the County of San Bernardino. It is owned by the heirs of Syed Farook, who I hasten to note have committed no crime. The County of San Bernardino (or, I think more accurately the FBI) has custody, possession, and control of the phone until such time as it stops being potential evidence useful for the prosecution of past crimes or the detection and prevention of future crimes.

      With that said, dead people do not have the same privacy rights as living people do. To the extent that Farook’s privacy might have been at issue, the calculus of those privacy interests did diminish substantially when Farook died.

      To your larger point: do you really think the ninnies of whom you complain would have acted differently had Apple acted other than it did? It is in the nature of governments to seek to expand their powers until legal or political boundaries are detected; it is in the nature of democratically-elected politicians to publicly posture in ways that are to their political advantage and in these circumstances fear of violence and fear of the mistrusted minority of Muslims is readily-leverageable to political advantage.

      Protecting privacy and civil liberties is an intangible abstraction that will not translate well into votes, until our culture calms down about people with names like “Farook”. So I don’t think it matters at all what Apple did in court as a practical issue.


      • Does the County actually give up ownership of devices issued to employees? That’s not how things work at my (private sector) employer or at most others with which I’m familiar.


        • If Farook was a county employee and the county issued him that phone, then I must have forgotten that facet of the story, and you are correct. And if that were the case, Farook would not have had anything but a minimal privacy interest in the phone in the first place, even while alive.

          My remarks assumed that he purchased the phone himself.


          • The phone in question was issued by the county; the shooters destroyed their personal devices to the best of their abilities (which was a pretty clear indication that nothing of value would be found on this phone).


    • And if the county had managed their phones like intelligent people, this would not be an issue.

      How about we pass a rule that all government devices be managed, instead of attacking encryption?


      • That’s fine, but… isn’t that management system pretty much the sort of “backdoor” that everyone has their knickers in a twist about? As in “an alternative method of gaining access to the device and the information therein” and therefore — by definition! — a compromise of security?


        • The management system is a backdoor that can be activated and deactivated by the customer. The problem is with backdoors that can’t be deactivated by the customer. Those are just security holes, and competent device makers don’t engineer them in. To use a simpler example, a reasonable corporate policy could “backdoor” laptops by installing an administrator account with a password known to the IT department. By the same token, it would be crazy for Apple to put an unremovable administrator account with the password known to Apple on all of its computers.


      • Hell, this is one reason my company went to Apple for all work-related phones – the higher security than Android (I believe). We’ve not had any real problems with this. Sounds more like the county didn’t manage their phones. This is not Apple’s problem.


  3. Feinstein is, as usual, an eager lapdog of intelligence and law enforcement agencies, but the bill is going nowhere.


  4. I was actually doing a little bit of research on quantum crypto the other day. If you use quantum crypto to exchange a one-time pad, you’ve got yourself the ability to send an uncrackable message. Like, uncrackable in theory. Like, it doesn’t matter if you have a Beowulf cluster of Tianhe-2s.

    And, of course, the gummint is going nuts about this.

    Fun wikipedia links:
    Al-Kindi was the father of frequency analysis.
    Cipher disks and their cousin, Jefferson Disks.


    • You don’t even need that.

      Spend the money and buy a single professional grade random number generator.

      Sit there and fill up pairs of small flash drives with small files of random numbers, each file named with a number. (Or MicroSD cards, if you want to hide them better.(1)) Most of them should be a single disk sector of 512 bytes, but you could have a few 10k files on there. (Most of the communication channels are apparently message boards, so they *already* can’t have large messages on them.)

      When someone needs to send a message, they pick a random file, XOR it across their message (you don’t even need a ‘program’ to do it, it’s probably a single memorizable command line in Unix), send the uuencoded results of that, along with the name of the random file. Tada, unbreakable encryption. They then delete that file so they won’t use it again.

      The other end does the same, and deletes the file also, so they won’t use it when sending.(2)

      After a year, throw the thing in the microwave. (Deleting the files was not any sort of security measure, obviously.)

      There is no need to spend time and effort using a quantum connection to get people their one-time pad keys. Just *hand* those keys to them, all at once. If you can’t figure out how to get agents in the field a *single* flash drive or microSD card, you are not a very good intelligence agency/terrorist organization/whatever.

      1) Handling MicroSD cards used to present a problem, but now almost all laptops have SD card readers, and everyone has a microSD card in their phone so has an excuse to have a microSD-to-SD adapter lying around…hell, it came free with the MicroSD card they bought for their phone. Nothing suspicious at all there, and their secret microSD card can be hidden almost anywhere.

      2) Theoretical problem: Both ends use the same file at the same time. Very small odds of happening, but easy enough to give one side evens and the other odds.
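The XOR step described above really is a one-time pad, and it is about as short in code as the comment suggests. A minimal Python sketch (standing in for the Unix one-liner; file handling and uuencoding omitted):

```python
import os

def xor_pad(data: bytes, pad: bytes) -> bytes:
    """XOR each message byte against the corresponding pad byte."""
    return bytes(d ^ p for d, p in zip(data, pad))

pad = os.urandom(512)                 # one 512-byte "file" from the drive
message = b"meet at the usual place at noon"
ciphertext = xor_pad(message, pad)    # unbreakable if the pad is truly random
recovered = xor_pad(ciphertext, pad)  # XOR is its own inverse
assert recovered == message
```

Reusing a pad file destroys the guarantee, which is why both sides delete each file after one use.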


  5. (10) INTELLIGIBLE. — The term “intelligible”, with respect to information or data, means — (A) the information or data has never been encrypted, enciphered, encoded, modulated, or obfuscated; or (B) the information or data has been encrypted, enciphered, encoded, modulated, or obfuscated and then decrypted, deciphered, decoded, demodulated, or deobfuscated to its original form.

    The list ‘encrypted, enciphered, encoded, modulated, or obfuscated’ is possibly the stupidest list I’ve ever seen in a law.

    First, they literally just forbid anyone from transferring that information to them electronically, because they don’t know what the hell ‘modulate’ means. Hint: It is not something you do to hide information, it is something you do to *transmit information* over a carrier wave.

    ‘I would email you the results of that warrant, but I’m afraid that would be *modulating* the information; in fact, it would modulate an RF signal to get the information to my wifi router, and then my wifi router would demodulate that, but modulate an electrical signal to get it to my cable modem, and then that cable modem would demodulate that, but modulate an RF signal to get it to the ISP. So you’ll have to come get it in person. (Yes, my ISP would presumably demodulate my cable modem signal, but how can *I* be sure of that?)’

    Also, someone please explain how to turn over any videos or images or audio without them being ‘encoded’? Does Congress even know what the word *encoded* means? Those things start out analog and are then *encoded* into digital. The ‘unencoded data’ exists for a few nanoseconds’ worth of wiring inside the camera or microphone until it hits an analog-to-digital converter on the board! No one has the unencoded data!

    OTOH, that would be a hilarious way to give it back to them: Here is a picture on my screen, which I have carefully decoded back into photons. No, you cannot have the file. That file is *encoded*, I have to give you the *decoded* picture, so here it is, on this screen. I guess you can take pictures? Also, hold on a second, I’m going to play some audio files for you…

    …no, I *couldn’t* have just played you the audio over the phone. Almost all phone transmission is *modulated* at some point.

    Also, there is no difference between the word ‘encipher’ and ‘encrypt’. To encrypt something *is* to run it through a cipher, you twits. That’s how computers encrypt things. (Technically, encryption can also refer to *non*-math-based hidden information, like spy phrase books, where ‘the library’ means ‘meeting point #2’ and ‘surprise party’ means ‘kidnapping’, etc.…but any sort of commercial computer program doing that is extremely unlikely. Also, the word ‘encrypted’ covers all that.) Also, ‘encryption’ would include any sort of *encoding* done to hide information.

    Jesus Christ, you idiots, learn what technical words mean before you use them.


    • The ‘unencoded data’ exists for a few nanoseconds’ worth of wiring inside the camera or microphone until it hits an analog-to-digital converter on the board! No one has the unencoded data!

      Actually, thinking about that, that probably qualifies as ‘encoded’ also. The photons or sound vibrations were *encoded* into analog electrical impulses, and then *converted* into digital ones.


    • Also, there is no difference between the word ‘encipher’ and ‘encrypt’. To encrypt something *is* to run it through a cipher, you twits.

      Heh, somehow by the time I posted this I had forgotten that the OP covered this, sorta, without talking about the technical terms, so it sounds like I’m disagreeing with them and calling them twits. I was not intending that.

      My point is just that ‘encryption’ covers ciphering (which is the only sort of encryption a computer is likely to do!), and it also covers any sort of ‘encoding’ used to hide information. (This encoding would be done via…wait for it…a cipher. Ooo.)

      And you can just use the word ‘encryption’ to cover *all* those things, all sorts of ways to hide information, because that is literally what ‘cryptography’ is, and using that word would be a damn sight less stupid than, for example, including ‘encoding’ *in general*.

      Hell, character sets are a form of encoding, a way to map letters to numbers. Any electronic information is ‘encoded’! Time to fire up those printers.

      EDIT: Actually, even *human language* is a form of encoding, so now I have no idea what the hell they even *want* from us.
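Even the most mundane text demonstrates the point above: a character set is just an agreed mapping from letters to numbers.

```python
# Nothing hidden here, yet by the draft's literal definition this
# text has been "encoded": letters were mapped to numbers.
text = "Mk s slly twt"
numbers = text.encode("utf-8")          # str -> bytes
assert numbers[0] == 77                 # 'M' is 77 in ASCII/UTF-8
assert numbers.decode("utf-8") == text  # reversible, but still encoded
```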


      • It’ll be formulated into regulations by actual human beings with theoretical domain knowledge who will divine the will of Congress.

        Which, horrible language and basic understanding of cryptography aside, is “Thou shalt never cipher what you cannot uncipher”. You know, if you provide ANY encryption you MUST keep the key so it can be decrypted.

        Which, BTW, could conceivably require every SSL transaction to have the host store the key used (SSL is done by your browser and the host using public key encryption to agree on a key to actually encrypt the communications channel, so that key is only for that session). It might outlaw public-private key encryption entirely, or at least require that anyone generating a key-pair store the private key separately (with identifying tags), making the whole point moot. Oh wait, that just invalidated the certificate model that the internet relies on. Oops, internet just broke for shopping, banking, and “am I on the real website” purposes. Hope you didn’t like online shopping or banking.

        Oh, a common method for storing passwords online? Gone. One-way hashes would be illegal as a means to secure passwords, because, you know, “one way”.
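A sketch of why one-way password storage is irreconcilable with the draft’s wording: verification never reverses anything, it just repeats the one-way computation. This uses PBKDF2 from Python’s standard hashlib; the parameters and passwords are chosen purely for illustration.

```python
import hashlib
import os

def hash_password(password: str, salt: bytes) -> bytes:
    # PBKDF2 is deliberately slow and one-way: there is no function
    # that maps the stored digest back to the password.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

salt = os.urandom(16)
stored = hash_password("hunter2", salt)          # what the site keeps
assert hash_password("hunter2", salt) == stored  # login: recompute, compare
assert hash_password("hunter3", salt) != stored  # wrong password fails
```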

        I suspect this bill won’t go anywhere, but it’s exactly the sort of first draft that makes me glad Congress is so slow at times.

        Even with the usual regulatory clean-up to turn it from “stupidly broad” into “regulations that actually do roughly what’s intended” there’s no fixing this. It’s WAY too broad, because the people who wrote it are thinking “Whole disk encryption” or “encrypted email” and didn’t realize they just outlawed the fundamentals of even the half-assed internet security.


        • Which, BTW, could conceivably require every SSL transaction to have the host store the key used (SSL is done by your browser and the host using public key encryption to agree on a key to actually encrypt the communications channel, so that key is only for that session).

          Actually, reading that law, alternately you could store all incoming headers and the entire outgoing response. Then you can provide the ‘never been encrypted’ data.

          This, of course, is even *more* insecure than storing the SSL session key, and takes much more space. Although for static files, I guess you could just store the unique headers, and say ‘Here were the headers, and then I gave them this file’. (Although, really, it should be reasonable to say ‘Here is the generated html…if you want to know what the images look like, like if you want to know what http://example.com/images/blah.gif looks like, put the URL in your fricking web browser and download it, you idiots.’)

          You and I know this, but let me explain something about encryption for those who don’t: Encryption does not work the way you think it does, at least not for communication channels.

          Communications channels do a *whole bunch of nonsense* setting up a channel they know is secure, using really abstract and complicated math that proves they are both who they are and that no one can be spying on that channel, using rather large keys…and then the only thing they communicate over said channel is *a secret password* that they now both know. Tons of time and effort…to transmit a fricking 128-bit key that they then use to talk.

          For reference, the text in this sentence is 424 bits.

          They then open a *different* channel (or, really, switch the current one) to both of them talking using that password.

          A password that no one ever knows. No one. It’s not stored anywhere, except in memory. It’s a different one for each user. Actually, it often is set to expire every X minutes and the channel comes up with a new one.

          When that key expires, and is discarded….literally no one, ever, can decode the communication.

          That is how 99% of secure communications work over the internet.
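The “complicated math” can be sketched with toy numbers. This is Diffie-Hellman key agreement with an illustratively tiny prime (real TLS uses large primes or elliptic curves, and the exponents here are made up for the example): both ends derive the same session key, and the key itself never crosses the wire.

```python
# Public parameters, known to everyone (tiny for illustration only).
p, g = 23, 5
# Each side's private exponent; never transmitted anywhere.
a, b = 6, 15
A = pow(g, a, p)            # client sends this in the clear
B = pow(g, b, p)            # server sends this in the clear
key_client = pow(B, a, p)   # client's view of the shared secret
key_server = pow(A, b, p)   # server's view of the shared secret
assert key_client == key_server == 2
# Once both sides discard a and b, nobody can rederive the key.
```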

          Oh, a common method for storing passwords online? Gone. One-way hashes would be illegal as a means to secure passwords, because, you know, “one way”.

          Heh, I wonder how that’s going to work with the various digest authentication schemes for IMAP/SMTP, or for HTTP. The server never even *gets* the ‘unencrypted’ password.

          But it’s even more hilariously stupid.

          Consider the word ‘obfuscated’. Look at it carefully.

          Now consider all those minified js and css files.

          Hrm. Better make sure you have the unminified code for jquery and whatever other libraries you’re using.

And now ask yourself if a compiled computer program is ‘obfuscated’? I mean, that’s not how it came, and it’s much harder to understand this way. Is it obfuscated? Is it ‘enciphered’? (A semi-reversible mathematical process was run on it to make it, after all.) And, last, is it ‘encoded’…actually, that’s not even a question, of course it’s ‘encoded’!

          Congress is trying to outlaw *selling computer software* by anyone who does not have the source code. Yes, it really really is that stupid.

          The entire concept of a computer is encoding things. Literally everything is ‘encoded’, because all the bytes in a computer represent *something*. (If they do not, there is no point to having them!) And while the encoding can *change*, like switching image formats, or you can add levels, like adding compression or encryption, things can’t be ‘not encoded’.
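That point is easy to demonstrate: here is the same five-letter word in three different standard byte encodings, with no ‘unencoded’ version anywhere in sight (Python shown, but any language would do):

```python
text = "hello"

# The "same" word as three different byte encodings -- which one is
# the "original, unencoded" form? None of them. There isn't one.
ascii_bytes = text.encode("ascii")       # one byte per letter
utf16_bytes = text.encode("utf-16-le")   # two bytes per letter
ebcdic_bytes = text.encode("cp500")      # IBM EBCDIC: entirely different bytes

# All three decode back to the identical string; the bytes only mean
# "hello" by convention. Every representation is an encoding.
assert ascii_bytes.decode("ascii") == "hello"
assert utf16_bytes.decode("utf-16-le") == "hello"
assert ebcdic_bytes.decode("cp500") == "hello"
```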

          This law is just gibberish, people using words they literally have no conception of. It’s like a law outlawing ‘nuclear or chemical processes that produce heat or cause entropy’. Guys…that is literally every chemical process, including people…and also you just said basically the same thing twice.


          • But when they wrote the ACA, they clearly meant the word “state” to refer to the government as a whole and not individual states, except for the times when they clearly meant it to refer to individual states and not the government as a whole.


            The whole thing is like talking about Dan Brown’s books. People will say “oh gosh, I just finished Da Vinci Code, I never realized how the Catholic Church could totally do all that kind of stuff!” “What did you think about Digital Fortress?” “Aw, man, that thing was a piece of shit! Guy doesn’t know anything about that stuff, it’s like he made it all up!”


            • But when they wrote the ACA, they clearly meant the word “state” to refer to the government as a whole and not individual states, except for the times when they clearly meant it to refer to individual states and not the government as a whole.

No, when they wrote the ACA, they clearly did not proof it as well as they should have, and so included a passage which, if taken literally, would render parts of the law nonsensical, as those parts all assume the law works in a different way.

              That’s not anywhere near the same thing as having no idea what various terminology means.

Now, let me be fair to them: I’ll let them list both ‘encrypted’ and ‘enciphered’ as a belt-and-suspenders approach, even though ‘encrypted’ includes all possible ‘enciphering’.

And the ‘obfuscated’, admittedly, is just me finding fault. I doubt they’d actually demand originals of code, so it’s not really a problem with the law. Although it’s a bit stupid in a different way, because the entire point of ‘obfuscated’ things is that they work identically to the original. They just look strange if you read the code itself, usually done either to save space or as a programmer joke. No one obfuscates things as a security measure, because obfuscation is, literally, not any sort of security measure! So it’s not like the stuff is going to be *stored* obfuscated.

Of course, the person turning it *over* might decide to obfuscate it at that time…just like they might decide to hand the information over on 5.25-inch floppy disks, which could be considered a form of physical obfuscation. Which results in the other side *still* getting the information, so you’ve accomplished nothing but pissing off the court.

But the other two words…the slightest bit of computer knowledge would have resulted in someone saying ‘Wait, isn’t everything encoded? How do you have unencoded computer information? Isn’t, for example, textual information encoded in either the ‘American Standard *Code* for Information Interchange’, or in Uni*code*? Code is right there in the name!’

And anyone using the term ‘modulated’ to talk about anything on a computer is probably talking out of their ass. 99% of the people using that term have confused it with digital-to-analog conversion in general…and the rest are dumbass legislators, apparently, who think it’s something you’re going to be doing to entirely digital information! WTF?!

              Yes, computers actually *do* modulate things, as all radio communication operates via some modulation scheme, as does Ethernet, and probably other stuff. But it’s so low-level that no one talking about computers is going to be talking about it, unless they build Bluetooth chips or something. (Last time I remember that any end-user had to know anything about modulation, it was when modems were around.) It’s basically as if the legislators have demanded that information turned over to them be ‘de-voltage-converted, or turned over at the original voltage’. What is even the hell? That is not how computers work!

There actually are terms of art for what they want. Real, exact terms.

The result is called ‘ciphertext’, which is the result of a process called ‘encryption’, and what they want from people is called the ‘plaintext’. Those are the terms that cover everything they want, and don’t cover anything else. Thirty seconds of reading Wikipedia about computer encryption would have *given* them those terms. They…did not bother.
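In those terms, the whole pipeline fits in a few lines (a toy XOR one-time pad, for illustration only; the variable names are mine, not anything from the bill):

```python
import secrets

plaintext = b"attack at dawn"               # what the court order wants
key = secrets.token_bytes(len(plaintext))   # the ephemeral session secret

# Encryption: plaintext + key -> ciphertext
ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))

# Decryption: ciphertext + key -> plaintext
recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
assert recovered == plaintext

# Discard the key, and the ciphertext is provably just noise: every
# possible plaintext of the same length is equally consistent with it.
```

‘Plaintext’ and ‘ciphertext’ cover exactly what the drafters wanted, and nothing else — no minified JavaScript, no compiled binaries, no character encodings swept up by accident.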

That is not the same problem as the ACA, which clearly was just the result of two different versions of the thing being mashed together. You can think whatever you want about that, and whatever you want about whether it should have been decided the way it was, but it’s pretty clear it’s not gross ignorance of whether states or the Federal government exist.


              • I can hope the proposal is rewritten to use proper terminology, because it will make explaining how that particular cat is so far out of the bag that it can’t even see the bag even if it used the Hubble so much easier.

Now, I feel a bit sad for the Congressmen. This is a legitimate issue with a legitimate impact on law and order (laying aside terrorism, secret courts, etc. I’m just talking the ordinary business of search warrants and regular courts. The War on Terror crap is its own whole ball of mess) and they’d like to make it possible to actually serve warrants and get information ordered by the courts.

                It’s just….there’s probably NO solution at all. And the “obvious” solutions are nightmare horrorscapes if you actually understand how it would work. It’s akin to seeing someone die of carbon monoxide poisoning and trying to outlaw air. I mean yes, that’d fix the carbon monoxide problem all right….

                I mean like we’ve said in this thread — just trying to require cleartext means reworking the entire SSL setup, rewriting the entire electronic transaction protocols used by ALL the banks (both interfaces to customers and each other), getting rid of about 95% of everything that requires a password — INCLUDING every major operating system (which does not store your log-in password in cleartext. None of them do. Mobile or desktop) — and requiring them to keep vast records of all the keys you’ve ever used.

                Which is a worse security nightmare.

And fun fact — the invention of a real, honest-to-Gosh quantum computer would render ALL of that trivially breakable. Which is why security experts are rather feverishly trying to work out encryption methods that are secure against quantum hijinks. (A QC’s one special, magical ability is to factor large numbers in something like polynomial time, via Shor’s algorithm. And the slow part of breaking modern public-key encryption is that factoring large numbers takes FOREVER on classical hardware.)
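The factoring point in miniature (toy numbers only; a real RSA modulus is 2048+ bits, where this brute-force loop would run past the heat death of the universe):

```python
def trial_factor(n):
    """Find the smallest prime factor by brute force.

    Runtime grows exponentially in the bit length of n, which is
    exactly why RSA-sized moduli are safe against classical hardware.
    """
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f
        f += 1
    return n  # n itself is prime

# A semiprime like an RSA modulus, just absurdly small:
n = 2021  # = 43 * 47
p = trial_factor(n)
assert (p, n // p) == (43, 47)

# Shor's algorithm on a large quantum computer would factor even a
# 2048-bit n in polynomial time -- hence the scramble for post-quantum
# encryption schemes.
```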

                Because that list of stuff this legislation would break? Commercially available quantum computers would break it even harder.


              • “That’s not anywhere near the same thing as having no idea what various terminology means.”

                You write this, and then go on to write multiple paragraphs epitomizing exactly the viewpoint I’m mocking.


  6. The note you make on the two types of bad guys is illuminating – it’s a testament to the power of the narrative of terrorism that at a point when it’s quite obvious that really significant cost accrues to US businesses and citizens regularly because of info theft and lack of proper encryption of data, the mental model of the Senate is still “let’s say there’s this Terrorist and we need to break his code, but our G-men can’t do it because encryption” – that the first order problem is the scenario they can imagine (from a film or TV show, maybe) rather than the real thing happening at scale.


    • If that’s the number every time a phone needs to be cracked, that works for me. As long as the cost of doing it is high enough that they’ll only do it when it’s really important, we’re pretty safe. It’s when the cost of doing it approaches zero that things get scary. People start to float questions like, “Why not just read everybody’s data all the time as long as nobody notices?”


Comments are closed.