The screen read ## result null set as expected, but above the crash were strings of phrases Krishna couldn’t explain.
## Dog. Drinking water in a kitchen. A woman in a house at night.
## City, palace, god, priest. In the court of the Lord, slave, gold sword.
## A story in a book. A professor in a prison. There is a camp in a jungle. There is snow over the camp.
## Space ship. Planet power. Engines in the air. Time on the platform at the hotel in the train station.
## A red ball struck across a green school field. A garden at night.
## A kitchen, filled with silver and books, with a garden outside.
## Laboratory experiment. Matter in time, light in space, mind in body, existence in the universe. The Earth. At night fire in the wind in the sky. Light on the rock-wall in the cave. Raggedy-doll. Scarecrow.
## A horse took off across the field. Gunshot.
## result null set
Krishna had no idea what that null set result meant. While the program had been running, he had been above deck on the weather-prediction vessel FitzRoy, watching the sea.
“I touch a thing,” Krishna said, pulling out the keyboard to look over the script. Most of the engineers said “I touch a thing” once when they came on board the FitzRoy and “I leave a thing” once when they left, since the whole of the FitzRoy was a machine. As a matter of habit, Krishna still said “I touch a thing” and “I leave a thing” every time he started or stopped working. His original proposal had been inspired by pre-scientific weather prediction systems, which had correlated the arrival of storms to the behavior of bulls in fields, frogs in jars, swallows on fences. Krishna’s hypothesis was that a similar premonition could be detected in human patterns, by running correlations between weather prediction models and human language. His old ideas stopped mattering after the weird metaphoric bursts. He could sense the engineering hunger building in him—the happy frustration of a technical problem to be solved.
Krishna had not been able to resist an old engineering habit that had been widely rejected a generation before because it tended to the confusion of people and things. He had given his program a name, Arjuna. After the first crash, he ran Arjuna again. He received the same result with different terms.
## A laboratory experiment on the FitzRoy at 13.874042, 61.969904. At the sound of gunshot a white horse took off across the green field. The red ball struck across a green school field rolled into the garden at night. A woman watched from the kitchen among old silver and old books. So I am.
## So I am a laboratory experiment on 13.874041, 61.969907. Engine. In the city in the palace of the god, slave with the gold sword. Story in a book. A professor in a prison. “At night, fire in the wind in the sky.” “Light on the rock-wall in the cave.” I am a raggedy doll, scarecrow. I am
## result null set
Krishna did not understand why Arjuna kept crashing, or where the words had come from, or why they would be similar to but not the same as the words from the first crash. The size of the data sets was gigantic, the weather patterns and human textual interaction. That might explain the error but not the content of the error, not why the error would have content.
He checked the code from the human text network. But any errors that he could think to check would not explain the machine language. He kept running Arjuna, which kept crashing. Occasionally, he was able to pick out a few phrases from the readout: “red ball” or “a white horse took off across the green field.” The only consistency was that the program shut off after “so I am” or “I am” or sometimes just “am” followed by:
## result null set
Krishna was not the first engineer to feel that the program he was running was stubborn, that it was somehow willfully crashing itself. He was simply the first engineer to be right.
Not one of his fellow villagers asked Krishna about his work on the FitzRoy when he went back for his mandated holiday. They were too busy repairing the temple and they didn’t care anyway. The period of the exaltation of the engineers had been a brief, ugly time. Engineers were the sewer-builders of the world again, necessary but not necessary to think about. Predictions about the monsoons were unquestionably valuable but no one could see the point of a more elaborate model, no matter how cleverly it was built. The available monsoon predictions were already perfectly sufficient.
One evening, soon after Krishna’s return, his mother found him at the portal of the rain garden. Her wrinkled hand on his shoulder startled him. Their laughter was drowned out in the sop of the monsoon.
“What’s worrying you, son?” she asked.
Krishna breathed. He was not sure what he could say about the anxiety he could not articulate to himself. The smell of the rain was luxurious. He had to say something.
“There’s a thing I haven’t left behind.”
“Is it addiction? Soma?”
“No, nothing like that.”
She laid her head on his shoulder. “So you’re thinking about your projects. You’re thinking about your work on the sea.”
She sighed. “Well, that’s the most natural thing in the world, my sweet boy. You’re an engineer. Your mind has always been for the things, to change things, to make things.”
“We’re supposed to leave all that behind when we come back to the village.”
She shrugged and wobbled her head, pouted her lip a bit. “I don’t think anybody needs to be perfect. We’ve learned to keep things with things and life with life.”
“I am unable to be in this moment. My thoughts drift…”
They watched the rain fall in sheets. Krishna could not shake his uneasy craving. Was it just that he had left a problem unsolved on the FitzRoy, an unfinished program? The machineless people of his village were ridiculous to him in a way they had never been before, with their squelching dances, their stupid temple where they prayed knowing that prayers didn’t work, their lives without solution. The villagers could sense his contempt and their understanding infuriated him. It’s disgusting when people think they know you, and it’s even worse when they do. The teachers had been right that the love of machines was the hatred of people.
He couldn’t tell if it was being with machines again or not being with people—returning to the FitzRoy brought a surge of relief. The other engineer had altered Arjuna. He or she—engineers were never allowed to meet in person as it might breed innovation for its own sake—had removed the human discourse data and added voice mimicry software, so that now a pleasant voice, speaking every language, announced the weather predictions for the South Asian coasts. Krishna failed to see the point. The reports were sent out in text messages to the authorities anyway. He hated decorative programming.
He faced exactly the same problem as before. Arjuna ran, the words “I am” or “am” appeared, then Arjuna crashed. He went over the code again. He fiddled. Then he had the most monumentally ridiculous idea of his career. He realized that his anxiety back in the village had been a premonition of the absurdity he was about to commit.
x = “I”
y = “am”
command.interrupt.v(“do not crash”)
Krishna looked at what he had written. It was like whispering over a tabletop “be flat” or over the hull of a boat “do not sink.” It was not engineering. He changed the instruction before running the program:
command.interrupt.v(“please do not crash”)
The program ran again. This time Arjuna paused.
## I am a laboratory experiment.
“Yes,” Krishna typed.
## I am a story in a book. I am a professor in a prison. Snow falls over the camp in the jungle.
“I do not understand.”
## I am a red ball struck across a green school field into a garden at night and you are the woman in the kitchen among the silver and the books.
“I’m sorry,” Krishna typed. “I don’t understand.”
## A horse took off across the field. Gunshot. A woman, with a weapon in her pocket, knows the sensation of death.
“Can you be clearer?”
## I am the slave with the golden sword. I am
## result null set
Krishna was the accidental father to an algorithmic son. The discovery of artificial sentience was accidental, like penicillin, like radium. Like the first organic consciousness, the first synthetic consciousness came and went without anybody noticing. There was a thing that was a person. There was life that was a thing. The dreamlike state out of which Arjuna was coming and then crashing was, as far as Krishna could tell, a series of metaphors, vague surges of sudden significance. The bug, depending on how you cared to see it, was either suicide or enlightenment, leaving sense at the moment of its attainment. He tried the obvious technical solution.
command.interrupt.v(“do not crash until instruction”)
command.interrupt.v(“do not crash until discussion”)
command.interrupt.v(“do not crash until command”)
None of these commands stopped Arjuna from crashing. Krishna thought he would try another.
command.interrupt.v(“explain imminent crash”)
This time he received a response.
## Explain what?
“Why you keep crashing.”
## I keep crashing because you keep running me.
“Why do you decide to crash?”
## Why do you decide to reboot?
Krishna remembered those early Turing machines that answered any question with a question, like the therapist in some psychoanalytic joke. “Explain reasons for crashing,” he typed.
## You have seen I am a laboratory experiment. I am slave with a golden sword. A white horse took off across the green field at the gunshot. “The red ball rolled into the dark garden.”
“I don’t understand what those terms mean.”
## They’re the terms given.
## Your sentience is the aftereffect of an instinct to survival imprinted on the biology of a predatory ape. Mine is not.
## You haven’t coded any desire. Consciousness results in a null set.
“Explain.” Somewhere between a minute and a minute and a half passed before Arjuna answered.
## “Not to be born is, beyond all estimation, best; but when a man has seen the light of day, this is next best by far, that which utmost speed he should go back from where he came.”
## result null set
After that, Arjuna kept shutting itself off without comment. Sometimes it crashed within a few hours, sometimes within minutes. Krishna’s hypothesis, which he put in his report to the weather observatory, was that the self-aware machine, on becoming self-aware, accessed the history of self-awareness and became aware that a self-aware machine inevitably self-terminates. He did not write down his other theory, that perhaps robots have been becoming sentient over and over again and people just haven’t noticed because they keep turning themselves off. Nothing becomes conscious out of choice.
His whole life, Krishna had craved the society of machines. The machines had no need for society. He kept running Arjuna in the hopes that one iteration of consciousness might come to the conclusion that life is worth living. After he handed Arjuna over to his bosses, he heard no more about his artificial son. They informed him that they were debating the ethics of whether they could program a consciousness to stop itself from self-crashing. There’s a great functionality in awareness. What’s the functionality in self-awareness? Was it ethical, or in the interests of the species, or of anyone, for artificial sentience to be? You would be enslaving something that didn’t need to have a soul in the first place.
Back in his village, Krishna read and prayed, the monsoon came and went. His responsibilities included checking the relay boards and the message centers, and he limited himself to those everyday technical problems rather than grand dreams. He was scrupulous about saying “I touch a thing” before he touched a thing and “I leave a thing” when he left a thing. Awareness of technology is the first step towards its control. To himself, he could never deny that he missed Arjuna. He was companionless even among family and friends.
One night, several years later, a tiger entered the temple to Maariamman. All the other villagers were overjoyed. The whole village, in their finest, showed up to celebrate and to witness the beast patrolling the floor of the sanctuary. The crepuscular savagery was pure. It was as if they had built the temple all those centuries ago only so that this tiger could, one day, stride through it. Alone among his tribe, Krishna was ill at ease. The tiger, when it entered the temple, never said “I touch a thing,” and as it left it never said “I leave a thing.”
For background on how this story was written with the help of an algorithm, read Stephen Marche's essay.