
Encryption Program Used to Show ‘Criminal Intent’

May 25, 2005

Great good God. It seems that having an encryption program on your computer can become evidence that you are doing something illegal. My old colleague Declan McCullagh has written a story about a man standing trial in Minnesota for child pornography.

An appeals court upheld the trial judge’s ruling that the state could introduce evidence that the man used Pretty Good Privacy (PGP) encryption software on his computer, and that the existence of that application could be used to show that the man has criminal intentions.

From the article:

“We find that evidence of appellant’s Internet use and the existence of an encryption program on his computer was at least somewhat relevant to the state’s case against him,” Judge R.A. Randall wrote in an opinion dated May 3.

Randall favorably cited testimony given by retired police officer Brooke Schaub, who prepared a computer forensics report–called an EnCase Report–for the prosecution. Schaub testified that PGP “can basically encrypt any file” and “other than the National Security Agency,” nobody could break it.

I am loath, of course, to leap to the defense of someone involved in child pornography. And, in fact, I’m not going to do such a thing. Even criminals holed up in jails treat these people as the lowest form of scum, and I have no reason to argue. However, this case isn’t about child pornography. The ramifications reach far and wide, into the daily lives of anyone who uses any type of digital technology. After all, if a precedent is set that anyone using encryption can have the mere fact that their files are “locked” held against them in a court of law, then by its very nature, our legal system becomes one where defendants are presumed guilty and must – by opening their files – prove that they are innocent.

Let me pose a series of questions around this subject: Does locking your home suggest that some criminal activity is going on inside? Is denying a police officer entry to your home, by itself, enough to justify a warrant? Does keeping a safe locked in your study prove that you are hiding documents relating to something criminal?

I would argue that, in each case, the answer is no. And here’s why. From the Fourth Amendment of the Constitution:

The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.

This argument, of course, doesn’t take into account what information is actually in the home, in the safe, or in the encrypted documents. That information, if legally obtained, can certainly be used against someone. But the mere appearance of impropriety, based on a computer application that is used to keep documents private, certainly doesn’t seem like grounds for proving criminal intent.

This argument, though, does lead into a thorny bush, since the Digital Millennium Copyright Act makes it illegal to circumvent encryption. I wonder, then: if the state obtained a search warrant for the encrypted information, could the courts compel the defendant to open the files? And if the court did just that, and the defendant invoked his Fifth Amendment right against self-incrimination, could the state then attempt to decrypt the files itself – an act that, at first blush, would seem to violate another federal law?
