
The Documents in the Case

In cybercrime, they’re printouts, and not to be trusted

The letter from Carl Payne came in the spring of 1998. It was hand-written; no letterhead. I was suspicious. Being a columnist for The Boston Globe and the author of seven books, I get my share of communications from cranks, crazies and convicts. But Payne, I soon realized, was none of the above.

Payne wrote that he was the defendant in a criminal computer-hacking case. Back in December 1994, at the age of 28, he had helped start an Internet service provider in Utah that was eventually named Fibernet. But in the autumn of 1996 the board voted to oust Payne after he locked horns with the man who was poised to become Fibernet’s new president.

A week after Payne left Fibernet, someone had hacked into the company’s computers and ransacked its systems. Fibernet had immediately fingered Payne and persuaded the Utah County Attorney’s office to charge him with violating Section 76-6-703 of the Utah Criminal Code, “Computer Crimes,” a second-degree felony. The prosecution had a pile of evidence, the case was going to trial, and he needed my help.

At first glance, Payne did indeed look like the likely culprit. Studies have shown that most computer crimes are perpetrated by disgruntled employees. Most computer-hacking cases that reach a courtroom pivot on some aspect of the law, such as whether the hack was illegal, and not on whether the suspect actually did it. I had never heard of a case in which the accused “hacker” maintained his innocence, especially in the light of hard evidence. Yet that’s just what Payne was doing. Intrigued, I called him.

On the telephone Payne was talkative, friendly, and very worried. We agreed that he would send me all the evidence the County Attorney’s office had provided to his lawyer. I would assess its quality and write a report. If the case went to trial, and he still wanted me, I would come to Utah and testify. It would be my first stint as a paid expert witness.

A week passed and a thick packet arrived in my mailbox. It contained Payne’s account of the incident, the police report, depositions from all involved, and nearly 200 pages of computer printouts. After four hours spent poring over the documents, I emerged into the living room and told my wife: “Things don’t look good for Mr. Payne.”

Payne’s last day of work was October 30, 1996. On November 6, someone had logged into each of Fibernet’s main computers and started deleting files. Customer Web pages and e-mail were erased. Accounting information was wiped out. Then the attacker gained access to each of the company’s special-purpose communications computers, called routers, and deleted their programming. Ultimately, the company lost more than half its customers, laid off many employees, left its managers without salary, and nearly folded.

Payne, who had been Fibernet’s chief technical officer, certainly had the knowledge necessary to pull off the assault. And after his messy departure, he might have had a motive: revenge. Some other details also seemed to point in Payne’s direction: Among the several accounts used in the hack were one called “carl,” which presumably belonged to him, one called “dbowling,” which belonged to one of his friends, and one called “usenet.” Sometime prior to the attack, somebody had modified the “usenet” account and given it full system privileges, creating, to use the lingo of computer security, a “back door.”
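A word on what “full system privileges” means here, since it is the crux of the back door: on a UNIX system, privileges follow the numeric user-identification number, and an ID of zero is the all-powerful “root” account. The trial exhibits are not reproduced in this article, but in a standard UNIX passwd listing (the fields are account name, password, user ID, group ID, comment, home directory, and login shell) the change would look roughly like the invented example below, with the third field quietly switched to zero:

    usenet:Xb9rGk2LqVmPo:105:20:Usenet news:/var/spool/news:/bin/sh    (an ordinary service account)
    usenet:Xb9rGk2LqVmPo:0:20:Usenet news:/var/spool/news:/bin/sh      (the same account, now equivalent to root)

Anyone who logged in as “usenet” after that edit would have had the run of the machine.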

But perhaps the most damning document in the package was the report of the police officer who had gone to Payne’s house following the attack. When the officer arrived, he found that Payne had reformatted his home computer’s hard drive and was reinstalling the operating system. In the trash can next to the computer was
a pile of floppy disks. The officer neither impounded Payne’s computer nor seized the floppies; he later testified in court that he had assumed any potentially useful evidence was already destroyed.

It all looked suspicious. But another call to Payne produced a different perspective. The last week he was at Fibernet, Payne told me, he had turned over all the company’s administrative passwords to the new president. The next day, Payne discovered that his password had been changed. On the morning of the attack, Payne said, he had tried dialing Fibernet on his modem several times, on the remote chance that his account had been somehow re-enabled, but he had never successfully logged in. In fact, he was reformatting
his home computer because it crashed every time Fibernet rejected his password. All those disks in the trash, he said, were old files he was getting rid of in preparation for a move to California.

I wasn’t sure whom I should believe, but I was starting to like Carl Payne. He could have been me 10 years ago: a technically savvy geek who had gotten himself in trouble with a bunch of suits who were more comfortable with spreadsheets than C compilers. Perhaps he did it, perhaps he didn’t. But a closer inspection of the computer printouts that made up the heart of the prosecution’s case convinced me that,
no matter who the culprit was, there wasn’t enough evidence to convict anybody.

For one thing, none of the printouts allowed me to pinpoint a phone number or computer from which the attack had been launched, let alone the identity of the perpetrator. And something else called the
evidence into even greater question: It appeared somebody had tampered with some of the files before printing them out. The log had small typographical errors (a few extra spaces inserted on one line, a letter dropped on another), as if somebody had taken the original log files into a word processor and cut and pasted text before printing. This meant that the information on those pages was suspect. And why did all of this evidence come to me in printed form? Where were the original electronic records? Guilty or not, I
thought, no one should be convicted on the basis of tampered evidence.

I sent a six-page report to Payne, and continued to follow the case. In December, I boarded a plane for Utah. When I arrived at the Utah County Courthouse in Provo, the opening arguments had just concluded. The prosecution’s theory was simple: Carl Payne was a technically brilliant but hard-to-handle employee. When
Fibernet gave him notice that he was going to be terminated, Payne installed a back door that would allow him to wipe out the company’s computers after he left.

It turned out that in ousting Payne, Fibernet had fired the only employee capable of repairing the damage from the attack. So in addition to calling the police after the incident, the company had called in a computer consultant to try to get the system back up and running. The consultant, Stacey Son, became the lead expert witness for the prosecution.

Son’s testimony explained why there were only 200 pages of printouts in evidence: Fibernet had hired him to get the system working quickly, not to document the damage for an investigation, so he hadn’t attempted to preserve potentially incriminating or exonerating files. Neither had the police, it turned out: The officer
who visited Fibernet and then searched Payne’s house testified that he had no experience with the UNIX operating system that Fibernet and Payne used. Instead of impounding computers and disks, the officer had simply accepted the paper printouts Fibernet had handed over.

On the stand, Son admitted that there was no way for him to tell the identity of the perpetrator. But the biggest hole in the prosecution’s theory became apparent when the defense questioned Son about the attack itself. It was poorly done, Son explained: Not enough information was wiped out. It seemed to me to be the work
of an amateur with only rudimentary knowledge of UNIX systems, not that of somebody of Payne’s admitted prowess.

The prosecution rested on Thursday, the third day of the trial. That night in my hotel room, I looked again over those critical printouts. The prosecution’s most important exhibits were pages 151 and 152, which showed each account’s name, user-identification number, group number, encrypted password, and a third number
for accounting purposes. The user-identification number had been the subject of much testimony, since its
manipulation was a critical step in creating the back door. Nobody had discussed the significance of the accounting number.

Friday morning I woke up in my hotel room at 5 a.m. I had a hunch about the elusive last number. I needed to check the documentation for the version of UNIX that Fibernet had been using. I didn’t have the manual with me, but I booted up my laptop and found it on the Internet; it explained that the number was used to warn people when it was time to change their passwords: it indicated the number of days between January 1, 1970, and the last time the password was changed.

I felt stupid. Here was possibly the most important piece of evidence in the entire trial, and I had not even realized it until the morning I was supposed to testify! Encoded in the record of each account’s password was the date the password had last been changed; by decoding the number, I could establish precisely when the “back door” was created. In the hours before the trial, I wrote a small program to translate the numbers.
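The translation itself amounts to a few lines of code. The program from that morning is not reproduced in the article, but a sketch in a modern scripting language does the same job; the day count used below is invented for illustration, not taken from the exhibits:

    #!/usr/bin/env python3
    # Convert a UNIX password-aging field (days elapsed since January 1, 1970)
    # into a calendar date. A rough modern sketch, not the original trial program.
    import datetime
    import sys

    EPOCH = datetime.date(1970, 1, 1)

    def days_to_date(days):
        """Return the calendar date that falls `days` days after January 1, 1970."""
        return EPOCH + datetime.timedelta(days=int(days))

    if __name__ == "__main__":
        # Day counts may be given on the command line; the default value 9800
        # is an invented example that happens to decode to October 31, 1996.
        for value in sys.argv[1:] or ["9800"]:
            print(value, "->", days_to_date(value))

Run against each account’s accounting number, a script like this turns a bare integer into the date that account’s password was last changed, which is all the decoding the exhibits required.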

What my homemade program showed me clinched the case. The back door had been installed on October 31st, the day after Payne’s last day of work, and after his access to the Fibernet system had already been cut off. Payne couldn’t have created it. What’s more, another account’s password change dated to more than two weeks after the attack, a detail that would be impossible if the printout were really the same one Son had made that day. This showed irrefutably that the chain of evidence had been broken.

At 10 a.m. I took the stand. I described my credentials, the proper handling of security incidents, the paucity of evidence, and the telltale indications that the printouts had been altered. Finally, I testified about what I had learned that morning. From that point, everything moved quickly. Payne and his wife testified, the attorneys gave closing arguments, and the jury began deliberations around dinnertime. In the late evening, they came back with the only verdict I thought they could reasonably reach: not guilty on all counts.

Today, Carl Payne oversees a large computer network in California. Fibernet, meanwhile, is thriving. In the course of the trial I came to believe in Payne’s innocence, but never felt that I had learned the real story. In closing arguments, the defense suggested a few possibilities: Somebody at Fibernet could have carried out the attack. An employee whom Payne fired in July of 1996 might have done it. Or perhaps the
crime was committed by some unknown hacker on the Internet, an unfortunate coincidence with Payne’s dismissal.

Fibernet, for its part, declined to comment for this article.

There’s really no way to know what happened, because the Utah police did not do a meaningful investigation. They simply asked the victim, “Who did it?” and Fibernet answered: “Carl Payne.” The company then provided all of the evidence used in the prosecution. The police never would have followed such haphazard procedures in the wake of a physical break-in; they would have done their own detective work, carefully collecting and preserving the evidence. As more and more crimes occur in the neighborhood we call “cyberspace,” police need better tools and training. Without them, we risk bungled investigations and the very real possibility that innocent people will be found guilty for the hacks of others.
