A woman tries out neurotechnology equipment during Tech Week in Bucharest, Romania, in May 2023. Cristian Cristel/Xinhua via Getty Images

New neurotechnology is blurring the lines around mental privacy − but are new human rights the answer?

Neurotechnologies – devices that interact directly with the brain or nervous system – were once dismissed as the stuff of science fiction. Not anymore. Several companies are developing and some are even testing “brain-computer interfaces,” or BCIs, of which the most high-profile is likely Elon Musk’s Neuralink. He announced on Jan. 29, 2024, that the first human in the company’s clinical trials has received a brain implant.

Neuralink’s immediate goal, like that of other companies, is to improve autonomy for patients with severe paralysis or other neurological disorders.

But not all BCIs are envisioned for medical use: EEG headsets that sense electrical activity inside the wearer’s brain span a wide range of applications, from entertainment and wellness to education and the workplace. Yet Musk’s ambitions go beyond these therapeutic and nonmedical uses. Neuralink aims to eventually help people “surpass able-bodied human performance.”

Neurotechnology research and patents have soared at least twentyfold over the past two decades, according to a United Nations report, and devices are getting more powerful. Newer devices have the potential to collect data from the brain and other parts of the nervous system more directly, with higher resolution, in greater amounts and in more pervasive ways.

However, these improvements have also raised concerns about mental privacy and human autonomy – questions I think about in my research on the ethical and social implications of brain science and neural engineering. Who owns the generated data, and who should get access? Could this type of device threaten individuals’ ability to make independent decisions?

In July 2023, the U.N. agency for science and culture held a conference on the ethics of neurotechnology, calling for a framework to protect human rights. Some critics have even argued that societies should recognize a new category of human rights, “neurorights.” In 2021, Chile became the first country whose constitution addresses concerns about neurotechnology.

Advances in neurotechnology do raise important privacy concerns. However, I believe these debates can overlook more fundamental threats to privacy.

A glimpse inside

Concerns about neurotechnology and privacy focus on the idea that an observer can “read” a person’s thoughts and feelings just from recordings of their brain activity.

It is true that some neurotechnologies can record brain activity with great specificity: for example, developments in high-density electrode arrays that allow for high-resolution recording from multiple parts of the brain.

Paradromics, an Austin-based company, is developing a brain-computer interface to aid disabled and nonverbal patients with communication. Julia Robinson for The Washington Post via Getty Images

Researchers can make inferences about mental phenomena and interpret behavior based on this kind of information. However, “reading” recorded brain activity is not straightforward. By the time a human sees the output, the data have already passed through filters and algorithms.

Given these complexities, my colleague Daniel Susser and I wrote an article in the American Journal of Bioethics – Neuroscience asking whether some worries around mental privacy might be misplaced.

While neurotechnologies do raise significant privacy concerns, we argue that the risks are similar to those for more familiar data-collection technologies, such as everyday online surveillance: the kind most people experience through internet browsers and advertising, or wearable devices. Even browser histories on personal computers are capable of revealing highly sensitive information.

It is also worth remembering that a key aspect of being human has always been inferring other people’s behaviors, thoughts and feelings. Brain activity alone does not tell the full story; other behavioral or physiological measures are also needed to reveal this type of information, as well as social context. A certain surge in brain activity might indicate either fear or excitement, for example.

However, that is not to say there’s no cause for concern. Researchers are exploring new directions in which multiple sensors – such as headbands, wrist sensors and room sensors – can be used to capture multiple kinds of behavioral and environmental data. Artificial intelligence could be used to combine that data into more powerful interpretations.

Think for yourself?

Another thought-provoking debate around neurotechnology deals with cognitive liberty. According to the Center for Cognitive Liberty & Ethics, founded in 1999, the term refers to “the right of each individual to think independently and autonomously, to use the full power of his or her mind, and to engage in multiple modes of thought.”

More recently, other researchers have resurfaced the idea, such as in legal scholar Nita Farahany’s book “The Battle for Your Brain.” Proponents of cognitive liberty argue broadly for the need to protect individuals from having their mental processes manipulated or monitored without their consent. They argue that greater regulation of neurotechnology may be required to protect individuals’ freedom to determine their own inner thoughts and to control their own mental functions.

Seung Wan Kang, founder and CEO of iMediSync Inc., displays the company’s iSyncWave, which allows people to measure their brainwaves at home, at CES 2023 in Las Vegas. Ethan Miller/Getty Images

These are important freedoms, and there are certainly specific features – like those of novel BCI neurotechnology and nonmedical neurotechnology applications – that prompt important questions. Yet I would argue that the way cognitive freedom is discussed in these debates treats each person as an isolated, independent agent, neglecting the relational aspects of who we are and how we think.

Thoughts do not simply spring out of nothing in someone’s head. For example, part of my mental process as I write this article is recollecting and reflecting on research from colleagues. I’m also reflecting on my own experiences: the many ways that who I am today is the combination of my upbringing, the society I grew up in, the schools I attended. Even the ads my web browser pushes on me can shape my thoughts.

How much are our thoughts uniquely ours? How much are my mental processes already being manipulated by other influences? And keeping that in mind, how should societies protect privacy and freedom?

I believe that acknowledging the extent to which our thoughts are already shaped and monitored by many different forces can help set priorities as neurotechnologies and AI become more common. Looking beyond novel technology to strengthen current privacy laws may give a more holistic view of the many threats to privacy, and what freedoms need defending.

This is an updated version of an article originally published on Aug. 7, 2023.
