According to Nita A. Farahany, a professor of law and philosophy at Duke University, the day will come when companies know what products we want before we do. They’ll get the information directly from our brains.
If that sounds like science fiction, consider that L’Oréal, the world’s largest cosmetics company, has already partnered with neurotechnology company Emotiv to change the way people shop for perfume. You wear an EEG headset, machine learning algorithms interpret your neural response to different scents, and a consultant makes “personalized” and “precise” fragrance recommendations.
That use case may sound benign, and other devices that decode our brain waves could have benefits for our health and well-being. But because neurotech can reveal highly personal information, ranging from our desires to our political beliefs, it might be possible to use it in neuromarketing that invades our privacy and manipulates us. In her new book, “Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology,” Farahany argues that we can’t afford to wait until neurotechnology further advances before confronting its risks.
Farahany believes that the way forward is to reinvigorate our commitment to freedom and update our understanding of what is required to ensure it. She is calling for expanding ethics and the law to include a right to “cognitive liberty,” which would safeguard our brains and mental processes.
My interview with her has been edited and condensed.
What is neurotechnology?
Neurotechnology is any technology that can help access, decode, or change what’s happening in the human brain. For example, companies are marketing neurotech devices that can detect if drivers are sleepy, and this could improve safety.
You anticipate that the neurotechnology market will be worth $21 billion by 2026. What’s being developed?
There are different kinds of neurotechnology. For example, there’s neurotechnology that can be implanted in the human brain, such as a brain-computer interface that enables people with quadriplegia to type messages on computers using their thoughts. But what I’m most interested in and think will go mainstream is consumer neurotechnology or brain wearables. They’ll be embedded in our headphones, earbuds, and even software that goes into our Apple Watches and picks up our heart rate and associates it with brain activity. This technology can help us relax and focus, among other things.
You recommend that society adopt a new human right of cognitive liberty. What is it?
Freedom of thought, the right to mental privacy, and the right to self-determination — these are all existing human rights. But they have yet to be interpreted to address this coming age of brain transparency. The right to cognitive liberty would be a right to self-determination over our brains and mental experiences, and we would update those rights as neurotechnology continues to accelerate.
If neurotechnology can reveal or predict personal medical information, should medical professionals control it?
People shouldn’t have to go to a doctor every time they want to learn something about their brain or well-being. If someone with epilepsy wants to know if a seizure is coming, they should have direct access to that information. To ensure safety, regulators should require manufacturers to accurately represent what their neurotech devices can do, like decoding levels of fatigue or focus or identifying different health risks. Once that happens, people should not have to go through a “trusted intermediary” to access them.
Should we be allowed to use neurotechnology to enhance our minds as we see fit? Or should society impose limits to prevent cheating and limit inequality?
Many people think that using cognitive enhancers is cheating. But I think it’s very difficult to draw a line between the ordinary cup of coffee that most people have in the morning to wake up their brains and a device like a brain wearable that could improve a person’s focus and attention through neurofeedback. So not only do I think people should be able to get neurotech devices that can enhance their brains in different ways, but I also don’t think it’s cheating when they choose to use them. Continuously expanding our potential and experiences is fundamental to being human.
Should we be allowed to use neurotechnology to spare ourselves from suffering? For example, by erasing unpleasant memories?
I believe people have a right to self-determination over their brains and mental experiences — that they have a right to make choices that affect their own health and well-being. Not only should people be able to erase unpleasant memories, but they should also have a right to choose how, whether, and the extent to which they wish to suffer or continue suffering.
That said, the scenario can involve other considerations. For example, if a person witnesses a crime, does society have a right to access the information about what they saw before they erase the memory? Does a criminal defendant have a right to cross-examine the person before they erase their memory? So, I’m not advocating for an unfettered right to self-determination. But on balance, we should favor an individual’s right to self-determination unless the societal interest is strong enough to warrant otherwise.
Should employers be allowed to use neurotechnology to surveil workers?
Many employers are already surveilling workers to find out what they’re thinking and feeling by doing things like tracking their keystrokes and monitoring their webcams when they’re working from home on work-issued computers. Workplace surveillance is about to get much more precise by using brain sensors.
If an employer is only monitoring whether a commercial driver or a pilot is tired or awake, and the only information they’re getting from a person’s brain is something like a fatigue level from one to five, then I think, on balance, the employee’s mental privacy interest is not as strong as society’s safety interest in the employer having access to that limited piece of information. In many ways, it may be less intrusive than the kinds of monitoring they would otherwise do, like using a camera inside a trucker’s cab to figure out if a person is awake or sleepy.
Do you think we’ll need neurotechnology to give us a competitive edge against AI?
If anybody has played with ChatGPT, they should know the answer to that question is most likely yes! We’re going to need every advantage that we can get. Generative AI like ChatGPT can quickly produce answers to questions, including false ones; synthesize information; and propose new ideas. How much more efficiently can it do many tasks that we thought were uniquely human? For us to be competitive, many things will need to change in society, including how we can enhance our capabilities. We’ll need to consider brain-to-brain communication or brain-to-brain collaboration. Working on projects and ideas together may not only be desirable but necessary to keep up with the kinds of AI we’ve already created.
Can you elaborate?
If you’ve ever tried to communicate an idea to another person, you’ve realized it’s deeply challenging to give them a complete sense of how you see a problem or how you would work through it. Neurotechnology introduces the possibility that rather than filtering our thoughts and feelings through spoken or visual images, we could more directly share them. And that possibility is important because, to date, AI has been trained on our words and images. If we could communicate with one another in the full resolution that’s in our minds, it may be far beyond the capabilities of any AI we’ve created or ever will create.
Does anything protect us from the government using neurotechnology to access our thoughts?
I spent a lot of my early career scholarship trying to figure out if, at least within the United States, our existing constitutional protections safeguard us against the government using neurotechnology to peer inside our brains. I looked at our right against self-incrimination under the Fifth Amendment, our right against unreasonable search and seizure under the Fourth Amendment, our right to freedom of speech under the First Amendment, and the different privacy laws that exist across the country. At least at the federal level, there’s nothing in existing legislation or constitutional law that protects us against the government using technology to interrogate our minds. There are a couple of states that have some relevant laws. But if we’re looking at the Constitution, we are most likely out of luck.
Evan Selinger is a professor of philosophy at the Rochester Institute of Technology; an affiliate scholar at Northeastern University’s Center for Law, Innovation, and Creativity; and a scholar in residence at the Surveillance Technology Oversight Project. Follow him on Twitter @evanselinger.