- Brain stimulation can influence parts of our brain that drive our morals and cognition.
- We can become fairer, smarter, more self-controlled, more positive, more caring, and even more transcendent if we tweak our so-called “virtues control panel.”
- There’s always the risk that these highly advanced tools fall into the wrong hands; the future is both frightening and exciting.
Would you ever consider implanting a “virtues control panel” inside your brain, one that could turn you into a fairer and more compassionate person? Granted, you’d have to let scientists send electrical currents through your brain’s neurons for about ten minutes each day, or consent to having a chip embedded inside your head. Take all the time you need to think it through, but know that the future is already here: over the last two decades, neuroscientists have repeatedly stimulated specific brain structures to gauge the effect on our moral judgments.
“A part of the argument relies on what’s called brain lateralization,” James J. Hughes—associate provost for the University of Massachusetts in Boston, and executive director of the university’s techno-progressive think tank, Institute for Ethics and Emerging Technologies—tells Popular Mechanics.
Brain lateralization is the idea that different parts of the brain handle different functions; by turning activity in those regions up or down, scientists believe we can influence the way we think, feel, and behave. It’s this property of the brain that neuromodulation, the targeted delivery of electrical or pharmaceutical agents to a specific area of the body to alter nerve activity, could exploit to bring about a more morally and cognitively enhanced humanity. Hughes explains the concept in depth in a chapter of the book Policy, Identity, and Neurotechnology: The Neuroethics of Brain-Computer Interfaces.
We’ve been attempting to enhance our morality and cognition for centuries, but mostly through drugs. “Drugs like [synthetic] oxytocin increase the amount of trust we have for other people. Stimulant medications reduce fidgeting and increase our concentration on tasks. There are various studies showing drugs have moral consequences for our behavior,” says Hughes. However, drugs are a bit blunt in their approach and have the tendency to affect almost every system in our body.
How Does Neuromodulation Work?
Neuromodulation comes with a remarkable specificity, Hughes says. “You’re only targeting the part of the brain that needs change.”
We can do this externally, by sending electrical currents or magnetic waves to parts of the brain from outside the skull; these techniques are called transcranial direct current stimulation and transcranial magnetic stimulation, respectively. Or we can opt for more invasive brain-computer interfaces. One example is vagus-nerve stimulation, which places electrodes on the vagus nerve, the longest and most complex of the cranial nerves. Because the vagus nerve helps lower blood pressure and heart rate, stimulating it can moderate our “fight-or-flight” response.
Another technique is deep brain stimulation (DBS), a neurosurgical procedure that places electrodes directly in the brain. The deeper the intervention, the more neurons it can reach. “External neuromodulation can focus to a centimeter of brain tissue while DBS can create an electric cascade influencing 100,000 neurons or more,” Hughes says. (That’s still a small fraction of the human brain, which contains roughly 100 billion neurons and some 100 trillion connections between them.)
Researchers have achieved remarkable results when trying to boost self-control, empathy, intelligence, and even spiritual experiences in the lab. But what happens outside experimental settings, in the unpredictability of the real world? “While valid in controlled conditions, the effect of such modulation can lessen in real-world settings, whereby the combined activation of multiple neurocognitive networks function to affect how we think and respond to various environmental cues, in particular settings and circumstances,” James Giordano, professor of neurology and neuroethics at Georgetown University Medical Center in Washington, D.C., tells Popular Mechanics.
When we modify certain traits and tendencies to fit a socio-cultural standard of what is “right” or “good,” we are treading on thin ice, Giordano says. “What is morally ‘good’ may be another’s reality of what is seen as harmful, disruptive, and ‘bad,’” he explains. “There is no ‘moral circuit’ or ‘moral nucleus’ that can be turned on or off. Morality is a social construct.”
Hughes couldn’t agree more. How we view moral enhancement, he says, will be the central conundrum of neuromodulation as it moves forward. Some intuitions inherited from our evolutionary past, like “thou shalt not kill,” are here to stay. Others, like nepotism, the instinct to prioritize your own family over the interests of others, have fallen out of favor since the Enlightenment, says Hughes. “The Taliban have a totally different ethical system than NATO,” he continues. We can find more common ground if we neuromodulate not just one virtue but a total of six, Hughes argues in his 2022 paper: self-control, caring, intelligence, fairness, positivity, and transcendence.
Let’s say we want to create a fairer society. Fairness is the ability to treat others impartially and justly, without favoritism or discrimination. “A person who demonstrates racial bias and shows a lot of disgust or fear is governed a lot by their amygdala,” says Hughes. The amygdala, an almond-shaped mass of gray matter inside each cerebral hemisphere, is the principal generator of our “fight-or-flight” response and the conditioned fear that follows it. By contrast, behind our eyes and forehead lies the prefrontal cortex, a structure responsible for complex cognition and for moderating our social behavior. “The stronger your prefrontal cortex is in comparison to your amygdala, the more the fairness. You can either tell your amygdala to shut up or you can ignore the fact that your amygdala is driving your actions,” says Hughes.
What Is the Worst-Case Scenario?
What happens if these sophisticated tools fall into the wrong hands? Elon Musk’s brain-chip firm Neuralink conjures some of our worst dystopian nightmares: what if hackers gained control over someone’s brain? It would amount to a hostile takeover of the human mind. Likewise, what happens if a dictator used DBS to create a super-army of soldiers programmed to commit the wildest atrocities without an ounce of regret?
“We need to ensure that the technology application is itself moral,” Beena Ammanath, executive director of the Global Deloitte AI Institute, tells Popular Mechanics. “Any mature technology ready for public adoption should come with transparency and respect for privacy and people’s consent based on a full understanding of the technology and its potential impact on them,” Ammanath says.
Hughes, meanwhile, says the debate is political. “With almost all technologies, when people worry about them being used in fascistic or totalitarian ways, the real problem is fascism and totalitarianism. It’s not the tool,” he says.
Take, for example, the Comprehensive Nuclear-Test-Ban Treaty (CTBT) of 1996. It was signed by 172 countries, which agreed to prohibit “any nuclear weapon test explosion or any other nuclear explosion anywhere in the world.” These are the types of collective actions we could take to ensure neuromoral enhancement doesn’t open a Pandora’s box, Hughes suggests. (Still, it is worth noting that even the CTBT requires ratification by eight more countries before it can enter into force: China, Egypt, India, Iran, Israel, North Korea, Pakistan, and the United States.)
When we decide to flip the switches on our virtues control panel, we acknowledge that the self is “a useful illusion, an ensemble of various cognitive processes,” says Hughes. Nihilistic? Quite the opposite.
“By stepping out of what’s called the default mode network in the brain (which is the part of the brain that is associated with self-referential processes and anything related to ego), we are free to live in the moment and step out of the things that we feel certain about. We step out of the nature of reality that we feel certain about,” Hughes says.
Through that lens, the robotic-sounding virtues control panel sort of becomes a tool for viewing reality in a new, fascinating light.