🛠️  Hacking with Hamlet  👑

Plants heard by (un)sound machines

The New York Times recently reported on a study about sounds plants make when they are distressed. Reading the article, I couldn’t help but think of a short story by Roald Dahl called “The Sound Machine,” published in The New Yorker in 1949. I have written before about Dahl’s shocking prescience regarding technology and AI. As before, let me tell you a little about this story and its surprising relationship to reality. Then I’ll say something about how it connects to our work in AI.

In “The Sound Machine,” an inventor named Klausner builds a machine that can record sounds beyond the limits of human auditory perception and shift them into a range we can hear.

“I believe,” [Klausner] said, speaking more slowly now, “that there is a whole world of sound about us all the time that we cannot hear. It is possible that up there in those high-pitched, inaudible regions there is a new, exciting music being made, with subtle harmonies and fierce grinding discords, a music so powerful that it would drive us mad if only our ears were tuned to hear the sound of it. There may be anything . . . for all we know there may—”

Of course, writing in 2023, this machine is hardly “science fiction”—we can easily record sounds at a high sampling rate and pitch-shift them down to the human-audible range using software like Audacity. In fact, some of my “recreational research” has been to apply this idea to gather information about the world through high-frequency sounds. I think a lot about initiatives like the Hummingbird Clock, which timestamps historical videos by the “fingerprint” of the hum of the electrical grid that you can hear in the background. In the visual domain, I’m transfixed by slow-motion cameras and variations, like my colleagues’ work on motion magnification.
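If you want to try Klausner’s trick yourself, here is a minimal sketch of the idea in Python, assuming you have an ultrasonic recording saved as a WAV file and the soundfile library installed. The file names and the slowdown factor are just placeholders for illustration.

```python
# A software version of Klausner's machine: take a recording made at a high
# sample rate and play it back slower, dividing every frequency by the same
# factor. Assumes a (hypothetical) file "ultrasound.wav" recorded at 192 kHz.
# Requires: pip install soundfile
import soundfile as sf

data, rate = sf.read("ultrasound.wav")   # e.g., rate == 192000

# Relabeling the sample rate on output is the digital equivalent of slowing
# the tape down: a 4x slowdown brings a 60 kHz tone down to an audible 15 kHz.
factor = 4
sf.write("audible.wav", data, rate // factor)
```

This naive resampling stretches time along with pitch, like slowing down a tape; Audacity also offers effects that shift pitch while preserving duration.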

It can be a mysterious and transgressive and frightening experience, being exposed to sounds you cannot ordinarily hear, gathering information that was previously inaccessible to the senses. Or as Dahl writes of Klausner,

As he listened, he became conscious of a curious sensation, a feeling that his ears were stretching out away from his head, that each ear was connected to his head by a thin, stiff wire, like a tentacle, and that the wires were lengthening, that the ears were going up and up toward a secret and forbidden territory, a dangerous, ultrasonic region where ears had never been before and had no right to be.

And indeed, as Klausner experiments with his machine, he makes a sudden and uncomfortable discovery: when roses are cut from a bush, they scream a bloodcurdling scream.

He saw her reach down, take a rose stem in the fingers of one hand and snip the stem with a pair of scissors. Again he heard the scream.

It came at the exact moment when the rose stem was cut.

At this point, the woman straightened up, put the scissors in the basket with the roses and turned to walk away.

Was Klausner—and Dahl—right? The New York Times calls the sound more of a “nervous, popping noise,” but sure enough, “the peeved plants aren’t making sounds that humans can hear — they’re too high-pitched, and researchers had to process them into sounds you can hear now.”

In Dahl’s story, the realization that plants “scream” gives Klausner a rush of empathy for them. He goes on to experiment with other flowers—daisies, then a tree—and is increasingly horrified by the screams. After sticking an axe in a large tree he says, “Tree . . . oh, tree . . . I am sorry . . . I am so sorry . . . but it will heal. . . . It will heal fine . . .”

So far, so good—this new knowledge drives Klausner to love and respect plant life. Even though it took violent experimentation to get there, we can at least be glad that Klausner has gained some profound respect for life from this experience.

But as Klausner thinks further about the magnitude of this discovery, the thought haunts him more and more, and things begin to go off the rails.

He began to wonder about other living things, and he thought immediately of a field of wheat, a field of wheat standing up straight and yellow and alive, with the mower going through it, cutting the stems, five hundred stems a second, every second. Oh, my God, what would the noise be like? Five hundred wheat plants screaming together, and every second another five hundred being cut and screaming and no, he thought, no I do not want to go to a wheat field with my machine. I would never eat bread after that. But what about potatoes and cabbages and carrots and onions? And what about apples? Ah, no! Apples are all right. They fall off naturally when they are ripe. Apples are all right if you let them fall off instead of tearing them from the tree branch. But not vegetables. Not a potato for example. A potato would surely shriek; so would a carrot and an onion and a cabbage. . . .

Ultimately, Klausner threatens a doctor with that same axe, forcing him to dress the tree’s “wound” with iodine and check on it daily.


I think “The Sound Machine” has something to teach us AI researchers. I am not a biologist, but in the course of studying artificial intelligence I cannot help but marvel at human intelligence, gaining a heightened and immeasurable respect for the complexity of every human mind.

I think this has made me a more empathetic person. Yes, we all strive to respect one another’s inherent human dignity, and yes, we fail from time to time, in moments of anger or frustration or self-centeredness. But it is harder to fail when you are steeped every day in the miracle of intelligence that is every human mind—the enduring mystery of our perceptual, linguistic, motor, social, and countless other faculties. How can you be anything but humbled in the presence of another mind? How can you possibly bring yourself to hurt one? For similar reasons, I think learning about and appreciating animal intelligence can drive people to become vegetarian or vegan.

But sometimes this process goes wrong, such as when engineer Blake Lemoine recently lost his job over his claims that Google’s large language model was “sentient” and deserving of human rights, or when New York Times reporter Kevin Roose was profoundly disturbed by a chat conversation with Bing’s AI service, or when just yesterday a man died by suicide after a conversation with a different AI chat service.

Scientists have for decades shown how easily we attribute human-like characteristics to obviously non-human entities (famous demonstrations of this include Heider and Simmel’s video and Weizenbaum’s ELIZA). Today’s AI systems are increasingly tempting to accept as living, even when we know they are simply computer programs. It becomes easier and easier to rationalize the possibility that they are mind-like. See how Klausner’s doubts emerge:

“You might say,” he went on, “that a rosebush has no nervous system to feel with, no throat to cry with. You’d be right. It hasn’t. Not like ours, anyway. But how do you know, Mrs. Saunders”—and here he leaned far over the fence and spoke in a fierce whisper—“how do you know that a rosebush doesn’t feel as much pain when someone cuts its stem in two as you would feel if someone cut your wrist off with a garden shears? How do you know that? It’s alive, isn’t it?”

How do we orient ourselves around systems that actively present themselves as human-like, that actively hook into our minds’ hard-wired capacity for empathetic connection? How do we change ourselves and our societies in response?

I’m reminded of the thought I had years ago, reading the story in my school library: what if that’s just the sound that axes make? What if poor Klausner, primed to hear suffering and grief in scream-like sounds, fell prey to the pareidolic artifacts of pitch-shifting the ringing of a metal blade? I imagine him discovering this as he cuts open some lifeless cardboard package one day, the scissors somehow producing the same haunting “scream” as the axe on the tree. I imagine him ashamed, disillusioned, and reclusive, now extremely wary of seeing humanness in anything or anyone at all.

What I fear is a world where, having learned the danger of attributing emotional depth to machines, we respond by distancing ourselves even more from our fellow humans.