Mar 31 2010

Magnets and Morality

MIT researchers have published a paper in which they use transcranial magnetic stimulation to alter, or bias, the moral judgments made by test subjects. Many people have e-mailed me this story with the comment that it seems fishy to them. The idea of so simply changing morality with magnets triggered their skeptical detectors, which is reasonable. But in this case the research seems perfectly legitimate, although some of the reporting has been superficial or dubious.

The whole notion of using magnets as a biological or medical intervention has long been exploited by the dubious magnetic therapy industry. I do agree that there are many fraudulent or quacky magnetic devices out there with unscientific claims. I do not recommend that people buy magnetic insoles for their shoes or strap refrigerator magnets to their joints to relieve pain or promote healing.

However, transcranial magnetic stimulation (TMS) is different. The brain is an exquisitely electrical organ, allowing for electromagnetic interventions to its function. We have recently developed the technology to use strong magnetic fields, precisely tuned and focused, to either stimulate or inhibit the electrical activity in brain tissue.

I recently wrote about a study that showed that such an intervention could potentially help relieve migraine headaches. TMS is also being used to treat depression – a more modern manifestation of the old electroconvulsive therapy (ECT). Implantable stimulators are being used to treat seizures and the symptoms of Parkinson’s disease. It is likely that in the future we will be using many more electromagnetic interventions for neurological conditions.

So don’t be put off by the whole magnet thing – in this context it is legitimate.

Changing Morality

The other aspect of this story that triggered many people’s BS detectors is the notion of changing moral choices by simply affecting one small part of the brain. This aspect of the study was misrepresented in many of the mainstream reports that I saw. The discussion on the MIT website itself is much better.

The brain region in question, the right temporoparietal junction (TPJ), is involved with a function known as the theory of mind – the ability to imagine what some other creature is thinking or feeling. In other words, we understand that other people have their own thoughts and feelings similar to our own, and this enables us to consider their possible motivations.

This has obvious survival advantages, especially for a social species such as humans.

The theory of mind is also critical to making moral judgments. Most moral judgments depend strongly on the intention of the person, not just the final outcome. It is the theory of mind, and therefore the TPJ, that enables us to consider a person’s intention in arriving at a moral judgment.

For example, the study in question posed to subjects the following scenario: Two people are visiting a chemical plant. Person A asks person B to get them a coffee. Person B does so, and puts a substance in person A’s coffee that is labeled “poison.” It turns out the substance was just sugar, and person A is fine.

This scenario is designed to separate intention from outcome. Subjects were asked to rate the behavior of person B from a moral perspective, on a scale from totally forbidden to permissible. When the TPJ (but not a nearby control area) was inhibited by TMS, subjects were more likely to find the behavior permissible, because there was no bad outcome. This suggests that their theory of mind was impaired and their judgments were therefore less based upon intention.

This research is a follow-up to earlier research in which fMRI scans were used to see which brain regions are active when making such moral judgments. The new evidence helps confirm the prior association and points toward probable causation.

This research shows how fMRI and TMS are being used in a complementary way to determine the functional anatomy of the cortex – even for the most abstract intellectual functions. These technologies are still tricky to use, however, and I always emphasize that I would consider all such results preliminary until they are independently replicated.

Also, regarding moral judgments, the researchers emphasize that there is no single “moral center” in the brain; rather, moral judgments involve coordinating multiple types of processing and bringing them all together for a final decision. It is essentially a balancing of many small assessments, then deciding what the net result is.

What this research shows us is the role of the theory of mind in weighing the intention of a moral actor. We consider not only the outcome of people’s actions, but what they intended. It is fascinating that there is a specific neurological function that underlies these moral judgments.
