November 21, 2024:
When Rafael Yuste was a teenager growing up in Madrid in the 1970s, he had two great loves: the brain and the piano.
At age 15, he started working at a clinical laboratory run by his mom, where he was tasked with looking through a microscope and counting blood cells. He read the autobiography of the 19th-century Spanish neuroscientist Santiago Ramón y Cajal, which recounted how the famous researcher spent nights hunched over tissue samples in his basement, unraveling the mysteries of the brain as part of a grand quest to help humanity. The romantic depiction of a scientist’s life captured young Yuste’s imagination.
And then there was music. Yuste played the piano and dreamed of being a great composer. So he signed up to study at the Royal Conservatory of Music. In parallel, he entered medical school.
He didn’t realize it then, but his two great loves would end up intertwining: He did become a professional piano player — except that the piano is the brain.
“Imagine that the brain is like a piano keyboard,” Yuste, now a neuroscientist at Columbia University, told me. “Every neuron is a key. And with light, we can play the keys.”
Yuste is referring to a method called optogenetics, which he’s used on mice as part of an effort to understand how the brain produces perception and behavior. It involves genetically modifying neurons, usually by infecting them with a virus that delivers a gene for a light-sensitive protein. Once the neurons carry that protein, you can use a laser to activate specific ones in the visual cortex of the mouse’s brain, causing the mouse to see things that aren’t really there.
“When you activate the neuron with light and make it fire, this is what we call ‘playing the piano,’” Yuste told me. The phrase is now used in scientific papers as a shorthand for the technique. When he pioneered it on mice a decade ago, he was shocked at how easily he could manipulate their visual perception. Whenever he made images artificially appear in their brains, the mice behaved as though the images were real.
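To make that metaphor concrete, here’s a deliberately simplified toy model of the “piano keyboard” idea. It is not Yuste’s experimental setup or analysis code; every name and number in it is invented for illustration. The sketch just captures the core logic of optogenetic “writing”: a neuron fires only if it both carries the light-sensitive protein and is hit by the laser.

```python
import numpy as np

# Toy model of optogenetic "writing" -- purely illustrative, not real lab code.
rng = np.random.default_rng(0)

N_NEURONS = 88                        # one "key" per neuron, like a piano keyboard
opsin = rng.random(N_NEURONS) < 0.6   # which neurons took up the light-sensitive gene

def play(light_targets):
    """Return which neurons fire when the laser targets the given 'keys'.

    A neuron fires only if it is targeted by the light AND expresses the
    protein that makes it light-responsive.
    """
    targeted = np.zeros(N_NEURONS, dtype=bool)
    targeted[light_targets] = True
    return targeted & opsin

# "Play a chord": try to write a specific pattern of activity into the circuit.
chord = [3, 15, 42, 60]
fired = play(chord)
print("Keys targeted by the laser:", chord)
print("Keys that actually fired:  ", np.flatnonzero(fired).tolist())
```

In the real experiments the keys are neurons in the mouse’s visual cortex and the readout is the animal’s behavior, but the principle the toy illustrates is the same: with light, individual neurons can be driven on demand, like keys on a keyboard.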
“From the point of view of science, it was fantastic, because we finally broke into the brain — we broke the code that the neurons are using,” he recounted. “During the day, we were super happy, congratulating each other in the lab. But at night, I didn’t sleep.”
It was his first inkling that he’d one day need to help create laws to stop people from manipulating each other’s brains. He’d created the mouse version of the movie Inception. And mice are mammals, with brains similar to our own. How long, he wondered, until someone tries to break into the human brain?
“That was when I had my Oppenheimer moment,” Yuste said, referring to the physicist who led the design of the first atomic bomb. “You know like in the movie, when Oppenheimer looks at the first nuclear reactor and says, ‘Holy moly, look what I’ve done,’ and you can see his face turning sour?”
In fact, Yuste had a strange connection to Oppenheimer. Every morning, when the neuroscientist walked into his office on Columbia’s Morningside campus, he gazed out the window at a historic piece of the Manhattan Project: the very building in which American scientists first prototyped the nuclear reactor during World War II.
The view from his window kindled his own horror at what he’d built — and sparked his resolve to make sure the tech is properly regulated. After all, the physicists who built the bomb included many who would go on to lobby the US government and the United Nations to regulate nuclear energy. That led to the 1957 establishment of the International Atomic Energy Agency, which still helps keep nations from blowing each other up with nuclear bombs.
So, in 2017, Yuste gathered around 30 experts to meet at Columbia, where they spent days unpacking the ethics of neurotechnology. He wasn’t worried about the kind that has to be surgically implanted in the brain; those are medical devices, subject to federal regulation. What worried him was the unregulated, noninvasive neurotech increasingly coming onto the consumer market, devices that record your neural data (like the Muse meditation headband, which uses EEG sensors to read patterns of activity in your brain). That data can then be sold to third parties and used to figure out whether someone has a condition like epilepsy, even if they don’t want that information disclosed. Perhaps it could one day even be used to identify individuals against their will.
And as Yuste’s mouse experiments showed, it’s not only mental privacy that’s at stake; there’s also the risk of someone using neurotechnology to directly manipulate our minds. While some neurotechnologies only aim to “read” what’s happening in your brain, others also aim to “write” to the brain — that is, to directly change what your neurons are up to.
The group of experts convened by Yuste, now known as the Morningside Group, published a Nature paper later that year making four policy recommendations, which Yuste later expanded to five. Think of them as new human rights for the age of neurotechnology: the right to mental privacy, the right to personal identity, the right to free will, the right to fair access to mental augmentation, and the right to protection from algorithmic bias.
But Yuste wasn’t content to just write academic papers. “I’m a person of action,” he told me. “It’s not enough to just talk about a problem. You have to do something about it.” So he connected with Jared Genser, an international human rights lawyer who has represented clients like the Nobel Peace Prize laureates Desmond Tutu and Aung San Suu Kyi. Together, Yuste and Genser created the Neurorights Foundation to advocate for the cause.
They soon notched a major win. In 2021, after Yuste helped craft a constitutional amendment with a close friend who happened to be a Chilean senator, Chile became the first nation to enshrine the right to mental privacy and the right to free will in its national constitution. Then the Brazilian state of Rio Grande do Sul passed its own constitutional amendment (with a federal constitutional amendment in the works). Mexico, Uruguay, Colombia, and Argentina are considering similar laws.
As for the US, Yuste has advocated for neurorights at the White House and in Congress. He knows, though, that change at the federal level will be a grueling climb. In the meantime, there’s an easier approach: States that already have comprehensive privacy laws can amend them to cover mental privacy.
With Yuste’s help, Colorado became the first US state to take that path, passing legislation this year that amends its privacy law to cover neural data. California has followed suit with a law that puts brain data in the category of “sensitive personal information.”
If US federal law were to follow suit, neural data could fall under the protection of HIPAA, which Yuste said would prevent practically all of the potential privacy violations he’s worried about. Another possibility would be to have all neurotech devices recognized as medical devices, so they would have to be approved by the FDA.
Ultimately, preventing a company from harvesting brain data in one state or even one country is of limited use if it can just do that elsewhere. The holy grail would be international legislation.
So, Yuste has been speaking to the United Nations, with some modest success so far. After meeting with Yuste, Secretary-General António Guterres mentioned neurotechnology in his 2021 report on the future of humanity, “Our Common Agenda.”
In his most ambitious dreams, Yuste wants a new global treaty on neurorights and a new international agency to make sure countries comply with it. He imagines the creation of something like the International Atomic Energy Agency, but for brains. It’s a bold vision from a modern-day Oppenheimer.