UK law says provide key to encrypted data or go to jail (itworld.com)
164 points by bougiefever on July 3, 2013 | hide | past | favorite | 79 comments


In light of this, here's a fun paper/idea (+implementation) to consider:

"Neuroscience Meets Cryptography: Designing Crypto Primitives Secure Against Rubber Hose Attacks"

https://crypto.stanford.edu/~dabo/pubs/abstracts/rubberhose.... (summary; full paper (PDF): http://bojinov.org/professional/usenixsec2012-rubberhose.pdf)

Abstract:

"Cryptographic systems often rely on the secrecy of cryptographic keys given to users. Many schemes, however, cannot resist coercion attacks where the user is forcibly asked by an attacker to reveal the key. These attacks, known as rubber hose cryptanalysis, are often the easiest way to defeat cryptography. We present a defense against coercion attacks using the concept of implicit learning from cognitive psychology. Implicit learning refers to learning of patterns without any conscious knowledge of the learned pattern. We use a carefully crafted computer game to plant a secret password in the participant's brain without the participant having any conscious knowledge of the trained password. While the planted secret can be used for authentication, the participant cannot be coerced into revealing it since he or she has no conscious knowledge of it. We performed a number of user studies using Amazon's Mechanical Turk to verify that participants can successfully re-authenticate over time and that they are unable to reconstruct or even recognize short fragments of the planted secret."

The scheme/system is bound to be imperfect; but it is a nice angle of approach, so to speak, and hopefully we'll have more stuff of this kind in the near future.

edit / P.S.: from the paper / intro section:

    Readers who want to play with the system can check out the training game at brainauth.com/testdrive


It is interesting, but has little to do with the original article, as the police will kindly ask the accused to authenticate, instead of asking for the key.


You are right, when you look at it like that, my comment was a bit of a hijack. The threat model in question assumes that the attackers know that the person in question has a way to auth / is part of an encrypted / limited access system, which is a very sensible assumption (and is included in the police/accused scenario in any case.)

I was probably thinking something along the lines of: [this proposed system] + a way for the user to discreetly convey to the auth/security system that they are being coerced, with the system then authenticating them to a bogus user account / set of sensitive data. But this would probably be cumbersome and very difficult to implement given the design in question.

It would have probably been more relevant to mention encryption systems with plausible deniability - e.g. TrueCrypt's hidden volumes [1] and Rubberhose FS [2].

[1]: http://www.truecrypt.org/hiddenvolume

[2]: http://en.wikipedia.org/wiki/Rubberhose_%28file_system%29


Sounds like Murakami's "Hard-Boiled Wonderland and The End of The World".

Even so, you can give the password by playing the game, which is why I would prefer to keep it in some form even I can't read and which is easy to be accidentally destroyed by the police when they seize your stuff.


This does not do cryptography at all. It authenticates that you know a "password" that the system already has to have in order to do the auth process. As a result anyone who has your system either has your private keys or can derive them trivially.


If by 'system' you mean the one the user is authenticating to, then yes, afaik. For one to have data encryption, it'd have to be coupled together with a data encryption solution.

If 'system' (in your last sentence) means 'user system' ('has' -> keylogger etc), it's not that simple afaik. There's no way for any eavesdropper (between client and server) to know, from the auth game data the server sends, which parts of the data/exchange signify the password. I think they partly addressed possible password/data correlation attacks over multiple logins -> sets of data, but I'm not sure if it was in this paper, it's been a while. It's not that trivial, as far as I can tell.


I meant the former. As far as I recall, the latter may have been possible. Their security model required someone to physically verify that it was a human logging into the system. This was to prevent someone from attaching a computer and using it to fake the responses to the challenges. I assume they were in fact worried it would do so by "learning" the real answer, but I can't recall.


A few of us kicked and shouted about this when this was proposed. If you'd like an example of how this is being abused, El Reg has a good article from 2009: http://www.theregister.co.uk/2009/11/24/ripa_jfl/


Good article!

> He returned to Paddington Green station as appointed on 2 December, and was re-arrested for carrying a pocket knife.

FTR, carrying a pocket knife is perfectly legal in the UK (assuming it is within a certain size).

> Officers bearing sub-machine guns broke down the door of JFL's flat. He rang local police before realising CTC had come for him. [...] JFL maintained his silence throughout the one hour time limit imposed by the notice. He was charged with ten offences under section 53 of RIPA Part III, reflecting the multiple passphrases needed to decrypt his various implementations of PGP Whole Disk Encryption and PGP containers. [...] In his final police interview, CTC officers suggested JFL's refusal to decrypt the files or give them his keys would lead to suspicion he was a terrorist or paedophile.

And my favourite paragraph:

> "There could be child pornography, there could be bomb-making recipes," said one detective. "Unless you tell us we're never gonna know... What is anybody gonna think?" JFL says he maintained his silence because of "the principle - as simple as that".

So he was jailed for remaining silent.


He was taken into custody for, among other things: Failing to attend multiple bail hearings; Falsely claiming that a passport that had been legally seized by the police was lost, bearing in mind he'd previous missed bail hearings by traveling abroad; Refusing to comply with lawful requests for access to his computer, etc, etc.

There is no '5th Amendment' in the UK, no right of silence and no Miranda rights. There never have been. We do have our own limitations on the rights of police officers conducting an investigation though. If officers have the proper warrants, I think it's reasonable that they are entitled to access to computer records in much the same way they are entitled to access to any other part of someone's property, business and private records.

He was willfully obstructive and obtuse, and suffered the consequences for it, but no more than that. The sectioning is of course a matter for concern, but it requires proper medical oversight and bearing in mind his previous history of mental illness there's no particular reason to believe it was malicious.


Actually there was a Right to Silence, in common law. It was removed by statute in the extremely controversial 1994 Criminal Justice Act (which many of us demonstrated against).


Ok, the situation is more nuanced than I might have indicated. There is a right to silence in criminal law, and you can remain silent in civil law too. You cannot be prosecuted for remaining silent, or directly legally coerced into giving a statement.

However, specific inferences can be drawn from your silence in some circumstances. For example, not mentioning something in your statement to police that you later rely on for your defence in court can be taken into account.

So we don't have an absolute right to remain silent, and doing so under suspicious circumstances can get you in trouble, but you can't be prosecuted just for not making a statement.


I am pretty sure that if you don't provide a door key the police will break the door down, so by the same logic they bear the burden of forcing entry.


That's completely back asswards. They have the right to request and require access, as specified in their warrant, and if you refuse or are unable to comply then they have the further right of forced entry. They have no 'burden' to do so other than their general 'burden' to perform their duties.


IIRC it's illegal to armour your door so as to prevent determined entry though.


FTR, carrying [some] pocket knives is legal in England and Wales. It is a criminal offence in Scotland.

The UK consists of England, Wales, Scotland and Northern Ireland. 3 different legal systems (though England and Wales and Northern Ireland are similar).

Scotland is radically different, with many things being criminal offences in Scotland that are perfectly legal in the other three countries, e.g. certain types of violent pornography can mean serious jail time in Scotland but are legal elsewhere in the UK. Plus the knife thing, obviously. Children of 8 years are criminally responsible [i.e. /are/ prosecuted] in Scotland, but the age is higher elsewhere in the UK.


Thanks! My knowledge is mostly from: https://www.gov.uk/find-out-if-i-can-buy-or-carry-a-knife

> It is illegal to [...] carry a knife in public without good reason - unless it’s a knife with a folding blade 3 inches long (7.62 cm) or less, eg a Swiss Army knife.

(it doesn't mean what you're saying is untrue!)


The case is absolutely awful, but he was jailed for refusing to hand over the key, which is (according to the article) exactly what he did. The article is very kind to the suspect here, and nowhere does it even suggest that he had lost the key or otherwise wasn't able to decrypt the files.


Since when is refusing to incriminate oneself punishable with jail time? Last time I checked it wasn't.

Refusing to provide encryption keys is the same thing. There might be illegal data, there might not be. It's the duty of the police to prove it, not the accused.

Innocent until proven guilty? Not in the UK.


This is the UK not the US. They don't have a 5th amendment.

In the US courts are split on the issue. Some say giving an encryption key is testifying, an act of the mind, and you can't force someone to testify against themselves under the 5th. Others say it's like handing over a regular key, which you can be forced to do, because the 5th covers testimony, not everything incriminating. It was intended to prevent forced confessions.

Once you're in front of a court you don't get to keep secrets, with the exception of some narrow protections. This has always been the case in the Anglo-American system.


The debate is partly this: forcing you to produce an encryption key /also/ testifies that the drive belongs to you (or you had access to it).

You are not necessarily obligated to testify to that fact for the police, and unless they can demonstrate that the drive belonged to you (or you had access) through some other means, the production of an encryption key is tantamount to forcing that confession.

It's much like if there were a lock on a gun they found on the street: if they can't link the gun to you already, they can't demand you turn over the combination for the lock, because knowing such a combination would be a tacit admission to knowing about the gun.


I'd think it's also an issue of the 4th. Consider sharing a machine with different user accounts. Asking me to decrypt a drive just might be an unreasonable search, because outside of logical partitioning, there's no concept of "a user's files" when the police just use a disk copy tool to pull all files off the disk.


We don't have anti-incrimination laws. If you go in front of a judge, the judge has the right to all relevant information with which to make a decision. What happened here is that the judge asked for the keys and was refused them. Plain and simple. If you don't trust your judicial system to (eventually) get to the right answer, you've got bigger problems.


"there can be no doubt that the right to remain silent under police questioning and the privilege against self-incrimination are generally recognised international standards which lie at the heart of the notion of a fair procedure"

from http://en.wikipedia.org/wiki/Right_to_silence_in_England_and... (and cite-note 15)


Interesting case (http://www.bailii.org/eu/cases/ECHR/1996/3.html), thanks for the link.

The case concerns a claim that it was unjust to make inferences as to the guilt of the appellant based on their choice to remain silent before the police and court. This appeal under Art.6 ECHR failed (though a claim of preventing access to an attorney succeeded), the court finding that there was no undue inference made, and that any inferences as to guilt that had arisen out of the defendant's failure to break silence were allowable.

I'm not sure this really helps as much as it seems to, as it appears to allow the [partial] curtailment of the presumption of innocence.


It's interesting that the article conflates right to silence and self-incrimination, I thought they were kept separate. It mentions the exception for encryption but doesn't mention any exception for safe codes which is what my line of thinking was based on. That being said, I can't find a reference to that either...


> If you don't trust your judicial system to (eventually) get to the right answer, you've got bigger problems.

Of course we don't (completely) trust the judicial system. That's the point of half the Bill of Rights, for one.


nemo tenetur prodere seipsum.


So this is one of those laws that everyone broke.


nope, it's one of those laws that the police can use on anyone they don't like, anytime. There was a good article about how this "criminalise everyone, and then only prosecute the ones you don't like" attitude is essentially how the US operates by default nowadays - it was on HN a few months ago but I can't find it now.


Think that was the point of the laws everyone broke.


apologies if I wasn't clear; there are plenty of laws "still on the statute book" which everyone breaks but which would never be prosecuted, e.g. copying music from CD to iPod before about 2010, or compulsory archery practice with the parish vicar on the village green on Sundays.

I was drawing the distinction between that type, which are simply archaic, and this type, which have been designed to incriminate everyone.


there was a chap who had this law used against him a few years back, I don't remember what happened though but it was a big deal at the time.

It was to do with encrypting random data for physics or a game or something and the government got hold of it but because the data was encrypted one way he couldn't give them a key and he was going to go to prison for two years.

I've heard nothing about it since though with all this PRISM stuff and people encrypting their emails I'm surprised it hasn't resurfaced sooner.



It gets more interesting than that though. Under the Regulation of Investigatory Powers Act, if you decide to comply and hand over your encryption keys, or to, say, decrypt an email, you are legally obliged to not tell anyone that you have done so.

But there is nothing in the law stopping you from saying up front "the only reason I would revoke my encryption key without explanation is if I'm legally obliged to by the cops under the Regulation of Investigatory Powers Act". And when you do so, anyone with a brain can draw the relevant inference...


It's tempting to think of this as a UK-specific issue, but countries like the UK serve as role models.

It might be prudent to start campaigning against the most egregious provisions of RIPA.


This is... interesting.

Considering how close the UK and US are, it could go like this: raise public awareness about PRISM and the like, prompting people to encrypt their stuff. Now, imprison everyone who has encrypted files.

It's like adding a fluorescent agent in a medium to highlight bacteria.


You don't imprison everyone, you imprison a few unlucky citizens and the rest are shitting their pants thinking "it could be me". Fighting a whole population would escalate; targeting single individuals spreads fear, since they are just a few people whose cases can be manipulated so that public opinion is divided. That's a difference between how regimes and 'democratic' countries operate.


Exactly. Regimes use direct oppression, while democracies use terrorism against people.


> Regulation of Investigatory Powers Act 2000

Incidentally this is the same law that is also being used to legally justify GCHQ's Tempora operation (the UK's PRISM), according to this Guardian article[1] and discussed on HN previously.

[1] http://www.guardian.co.uk/uk/2013/jun/21/gchq-cables-secret-...


I think it's quite frustrating that apparently nobody in this comment thread bothered to read the relevant laws.

It is a sufficient defence in law to state that you do not have access to the key file. The only requirements being that you can show some backing and that the prosecution cannot prove beyond a reasonable doubt that you do have access to it.


> It is a sufficient defence in law to state that you do not have access to the key file. The only requirements being that you can show some backing and that the prosecution cannot prove beyond a reasonable doubt that you do have access to it.

How do you prove you don't have access to a key to data that isn't actually encrypted? Do you need to keep sets of fake keys for sensor data that you lose, so you have a defense?


If the data isn't actually encrypted or it is otherwise not protected information then you wouldn't need to provide a specific defence.


Sensor data is indistinguishable from encrypted data. How can some law apply to one and not to the other?


To elaborate on this:

Ideally, encrypted data is indistinguishable from random data, otherwise known as "noise". Sensor data, radio telescope data and so forth often contain lots of that: it's just a LARGE file of bits that seem uncorrelated. No one can prove that a multi-gigabyte file of recorded data really is sensor output, as opposed to someone having renamed super-secret.tar.gz to sensor-logs.tar.gz.

Since no one can tell the difference, there's a pretty reasonable fear that police could see data related to your hobby (dumping ROMs, analyzing data, etc) and say, "You need to decrypt this so that we can see that it doesn't have $(illegal stuff) in it".
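The "looks like noise" claim is easy to check empirically. Here's a minimal sketch (Python, stdlib only; the sample inputs are made up): Shannon entropy of uniformly random bytes sits right at 8 bits/byte, which is also what good ciphertext produces, while ordinary structured data sits well below.

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte; ~8.0 means indistinguishable from noise."""
    n = len(data)
    counts = Counter(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

noise = os.urandom(64 * 1024)   # stands in for sensor logs *or* ciphertext
text = b"the quick brown fox jumps over the lazy dog\n" * 1500

print(f"random/ciphertext-like: {shannon_entropy(noise):.2f} bits/byte")  # close to 8.0
print(f"plain English text:     {shannon_entropy(text):.2f} bits/byte")   # well below 8
```

An entropy test (or any other statistical test) can only say "this is noise-like"; it cannot say whether the noise is ciphertext or sensor readings, which is exactly the problem.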


Probable cause. If there's no evidence of any kind that the data is actually encrypted data vs random sensor data then there is no way for this law to be invoked.


Specifically there has to be evidence that the data had a prior 'intelligible form' before encryption, in the case of a file of white noise there can be no such evidence.


The problem is that probable cause doesn't mean it is actually encrypted. What happens when they have probable cause, but it is actually sensor data?


How do you show backing in a case like this? I say I don't have it and I show them in support that... ?

I'm not trying to be mean, by the way, I'd honestly like to know. ^^;


Out of the 100+ files and archives I have encrypted, I remember the passwords of about 25 of them.

I would so go to jail.


likewise. There's quite a lot of valuable data I have encrypted in various blobs that I'd love to get back, but have totally forgotten the passphrase. I keep them around because I hope that one day I'll have an epiphany (or bruteforcing/JtR becomes practical)

Bit of a liability though, if I ever come under suspicion of anything. I just can't bear to nuke them though.


Would the plausible deniability that comes with using a technique like TrueCrypt's hidden volumes help in a situation like this?


It might, but to be perfectly honest most people don't keep partitions of random-looking data, or large files containing what looks like it. Your plausible deniability would be of the form, "I was getting ready to make a hidden volume there, filled it with random bits etc., but I never got around to actually making it."

I'm not actually sure that TrueCrypt lets you separate these two aspects of creating a hidden drive, but Linux's tools do. With LVM (to create volumes in volumes) you could create a partition which exists within an encrypted partition, so that it's full of random data to begin with -- but then you could plausibly have forgotten to do anything with it after your computer was up and running.

Large random-looking files are a bit different; if someone were to ask "what's this 10 gig file of random data doing on your hard drive?" it would seem hard to answer them. The only thing that I know people use that much random data for is testing an RNG for its statistical properties.


A normal (i.e. non-hidden) TrueCrypt volume is also by default filled with random data. With a hidden volume you first create the normal volume, which as part of that fills the file with random data, then create the hidden volume inside the normal volume.

One password decrypts the normal volume and another decrypts the hidden volume. However, with just the normal volume password you can't determine the existence of the hidden volume (as long as you take some precautions to prevent leaking information about the hidden volume).


Ah, yes! Sorry, I'd forgotten that those existed as well.

I never really saw a deep potential for those -- the problem being that you cannot open the outer drive for writing without providing the password which enables the inner drive's reading, which means you're constantly leaking that information whenever you're using the outer drive (which ideally would be relatively frequent, so as to justify that it's not masking a hidden drive). So I'd just totally forgotten that TrueCrypt could do that. My mistake.


Huh? Why do you think that? The normal encrypted partition can be used independently of the hidden partition. You just need to be careful to ensure that the free space of the outer partition is enough to contain the hidden, inner partition.


>Your plausible deniability would be of the form, "I was getting ready to make a hidden volume there, filled it with random bits etc., but I never got around to actually making it."

Your denial would be "When I last reformatted the drive, I used random overwriting."


Fair enough, it's a little simpler but of the same kind; it's the same "yeah I have this space which I'm not using, so what?"


Any good symmetric encryption algorithm produces data that looks random. You can't tell whether it's just noise or encrypted data.

I'd expect a sane judge to treat cases like these as circumstantial evidence — e.g. the police thinks that random data is really encrypted data, but has no proof. From what I know, it is difficult (though not impossible) to land in jail based on circumstantial evidence.

I guess if you use encryption tools that add unencrypted metadata headers (as in "this is a file encrypted using AES-256 in CBC mode"), then the evidence is stronger.
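On the header point: formats like ASCII-armored OpenPGP do announce themselves, which is exactly the giveaway meant above. A toy sketch (Python; this only handles the armored case -- binary OpenPGP packets would need a real parser, and the sample blob is invented):

```python
import os

PGP_ARMOR_HEADER = b"-----BEGIN PGP MESSAGE-----"

def looks_like_armored_pgp(blob: bytes) -> bool:
    # ASCII-armored OpenPGP output starts with a fixed, human-readable banner;
    # raw noise or headerless ciphertext carries no such marker.
    return blob.lstrip().startswith(PGP_ARMOR_HEADER)

print(looks_like_armored_pgp(b"-----BEGIN PGP MESSAGE-----\n\nhQEMA..."))  # True
print(looks_like_armored_pgp(os.urandom(4096)))                            # False
```

This is why headerless formats (plain dm-crypt, TrueCrypt containers) are the ones people reach for when deniability matters: there is literally no magic string to grep for.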


Not quite what you're talking about, but there were recently a couple of posts by the 'binwalk' author about differentiating encrypted vs compressed data (presumed header-less):

http://www.devttys0.com/2013/06/differentiate-encryption-fro...

http://www.devttys0.com/2013/06/encryption-vs-compression-pa...
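The core of the technique in those posts is a chi-square test over byte frequencies. Here's a simplified stdlib-only sketch (not binwalk's actual code); it separates obviously structured data from noise, and the posts refine the same statistic further to split compressed from encrypted:

```python
import os

def chi_square_stat(data: bytes) -> float:
    """Chi-square statistic of byte frequencies against a uniform distribution.

    For truly uniform data the statistic hovers around 255 (the degrees of
    freedom); large values mean the byte histogram is visibly non-uniform.
    """
    expected = len(data) / 256
    counts = [0] * 256
    for b in data:
        counts[b] += 1
    return sum((c - expected) ** 2 / expected for c in counts)

print(f"random: {chi_square_stat(os.urandom(64 * 1024)):.1f}")              # ~255
print(f"text:   {chi_square_stat(b'all work and no play ' * 3000):.1f}")    # huge
```

The subtlety the articles exploit is that compressed data is *close* to uniform but not quite, so it fails chi-square slightly while good ciphertext passes.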


Wow! That was really interesting. Thanks for sharing those; have you considered submitting one of them as a story?


The first was submitted a couple of weeks ago at https://news.ycombinator.com/item?id=5871927 but didn't seem to take root.

Hopefully there'll be a part 3 that gives an excuse to resubmit :)


It's exactly the opposite: any random noise could be assumed to be encrypted data and therefore you can be jailed for being unable to decrypt the noise.


Technically any random noise could be encrypted data.

You could just store a 1 time pad somewhere that decrypts it to cat pictures


That.

If you have that 1GB file that you simply can't explain, just xor it with some cat photos.
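The one-time-pad trick is literal: for any blob of noise you can manufacture, after the fact, a "key" that decrypts it to whatever innocent plaintext you like. A sketch (Python; the cat picture is a stand-in):

```python
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    # XOR is its own inverse: (noise ^ pad) ^ pad == noise, etc.
    return bytes(x ^ y for x, y in zip(a, b))

innocent = b"just a harmless picture of a cat" * 100   # stand-in for cat.jpg
noise = os.urandom(len(innocent))                      # the "unexplainable" blob

pad = xor_bytes(noise, innocent)    # manufactured after the fact
recovered = xor_bytes(noise, pad)   # "decrypting" the blob with the pad
assert recovered == innocent
```

This is the flip side of a one-time pad's perfect secrecy: every plaintext of the right length is equally consistent with the ciphertext, so no one can prove which pad is the "real" one.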


Why not just keep all your encrypted files stored on a server that is not in the UK jurisdiction? Just SSH in to access your files.


It's an awesome weapon. Plant an encrypted file on somebody's computer, report to the police you saw that person was viewing child porn.


If you're going to do several illegal things to get someone in trouble you might as well just place child porn on their computer...


There are many ways you could legally put an encrypted file on someone else's computer. But if you use actual KP then you're putting yourself at unnecessary risk. Better to have something innocent so if you get caught before you finish your sabotage you can use the key to decrypt the data and show it was innocent.


Or put childporn in encrypted format. Just make sure the password is easy to break like 12345.


You don't even need to do that. You can just say that your music files/photos are really encrypted using stenography and you can't prove otherwise. They can't prove they are encrypted either, but I'm sure they wouldn't bother with such small details.


Same error as in the article, so for clarity/pedantry value: this is steganography (with a ga). Stenography is a fancy word for shorthand.


and it's still encryption, and they can prove that there is information there. A couple of projects (one from uMich's CITI lab) showed that you can tell with high certainty when an image has had steganographic data added to it.


Nope. They can show _with a high certainty_ that the image is not quite usual; that's still a pretty long shot from _proving_ the irregularity is actually steganography (but that's where we leave the un/certainties of crypto and enter the realm of law).


That is a fair point: they can show it was altered and "likely" to have concealed data, but not that it does, which would leave a reasonable doubt, one would hope. I joked with Peter Honeyman once that perhaps I should just add 'bite me nsa' as a secret steganographic message in all my images so that they would have something to read. He suggested that the people who make a habit of taunting the bears at the circus are the ones who show up in the stories about bear attacks :-)


here's what you could do. put an encrypted volume onto somebody's computer. call the cops, tell them he's got child porn on it. they seize his computer, he doesn't know the password.

what is even the fucking point in having a password if the state can just ask you for it?


There is never under any circumstances in any modern country a situation where you are allowed to "win" a fight against Constitutional and democratically created laws. Asking "what is the point in having a password if the state can just ask you for it" is like asking "what is the point in having a gun if the state can punish me for killing people with it at will?"

Encryption is there to protect against thieves, hackers, and other unlawful surveillance. Using encryption isn't ever going to make you impervious to the legal discovery process.


>"what is the point in having a password if the state can just ask you for it" is like asking "what is the point in having a gun if the state can punish me for killing people with it at will?"

no it's not, bad analogy. in the US the state can't put you in jail for not giving a password (remaining silent).


Relevant XKCD: http://xkcd.com/538/



