Researchers can now identify a person's emotions by analyzing his or her brain activity. It is the first time scientists have been able to identify emotions based solely on brain scans.

"Mind reading" faces many hurdles, such as the lack of reliable measurement techniques, participants' reluctance to convey their emotions and even participants' inability to consciously experience certain emotions.

Now, researchers at Carnegie Mellon University have shown that it is possible to "read" other people's emotions based on neural activity in the brain.

"This research introduces a new method with potential to identify emotions without relying on people's ability to self-report. It could be used to assess an individual's emotional response to almost any kind of stimulus, for example, a flag, a brand name or a political candidate," said Karim Kassam, assistant professor of social and decision sciences and lead author of the study.

The study was conducted on a group of actors because researchers needed people who could easily jump from one emotional state to another.

The participants were shown the names of nine emotions: anger, disgust, envy, fear, happiness, lust, pride, sadness and shame, and were asked to enter each of these emotional states multiple times, in random order, while inside an fMRI scanner.

To make sure the model captured the emotions themselves rather than the act of deliberately inducing them, the researchers also showed participants unfamiliar pictures and analyzed the resulting neural activity.

A computer model was then trained on the fMRI activation patterns recorded while participants self-induced the nine emotions. Drawing only on statistical regularities in that neural activity, the model correctly identified the emotional content of the photos the participants were viewing.
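To make that concrete, here is a minimal sketch, in Python, of what such a pattern classifier could look like. It is not the authors' actual pipeline: the voxel count, scan counts and choice of classifier are illustrative assumptions, and the data is random stand-in data. The point is simply that each scan becomes a vector of voxel activations that a classifier maps to one of the nine emotion labels.

```python
# Illustrative sketch only: train a classifier to map fMRI activation
# patterns (one feature vector of voxel values per scan) to emotion labels.
# Shapes, classifier choice and the random stand-in data are assumptions.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
n_scans, n_voxels = 450, 500          # e.g. 50 scans per emotion, 500 voxels kept
emotions = ["anger", "disgust", "envy", "fear", "happiness",
            "lust", "pride", "sadness", "shame"]

X = rng.normal(size=(n_scans, n_voxels))   # stand-in activation patterns
y = rng.choice(emotions, size=n_scans)     # stand-in emotion labels

model = GaussianNB()                       # simple probabilistic classifier
model.fit(X[:400], y[:400])                # train on earlier scans
probs = model.predict_proba(X[400:])       # ranked guesses for held-out scans
```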

In the study, the researchers trained the model on participants' earlier scans and tested it against their later scans to recognize their emotional state. A model picking an emotional state at random would score a rank accuracy of 0.50; this model scored 0.84, showing that it was identifying the viewer's actual emotional state rather than guessing.
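Rank accuracy works like this: for each test scan, the model orders the nine emotions from most to least likely, and the score reflects where the correct emotion falls in that ordering, averaged over scans. A score of 1.0 means the correct emotion is always ranked first, while a random ordering averages out to 0.50. A small sketch of how such a metric could be computed, assuming the model exposes per-class probabilities as in the sketch above:

```python
import numpy as np

def rank_accuracy(probabilities, true_indices):
    """Mean normalized rank of the correct label.

    probabilities: (n_samples, n_classes) array of model scores.
    true_indices:  index of the correct class for each sample.
    Returns 1.0 for a perfect ranking, about 0.5 for random guessing.
    """
    n_samples, n_classes = probabilities.shape
    # rank 1 = correct label scored highest, rank n_classes = scored lowest
    order = np.argsort(-probabilities, axis=1)
    ranks = np.array([np.where(order[i] == true_indices[i])[0][0] + 1
                      for i in range(n_samples)])
    return float(np.mean(1 - (ranks - 1) / (n_classes - 1)))
```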

In another experiment, the researchers trained the machine learning model on the neural activation patterns of every study participant except one, to test whether it could predict the emotional state of a person it had never seen when that person was exposed to emotional stimuli. The model achieved a rank accuracy of 0.71, still well above random guessing.
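This amounts to leave-one-subject-out cross-validation: train on every participant but one, test on the held-out person's scans, and repeat so that each participant serves as the test case once. A hedged sketch of that procedure using scikit-learn's LeaveOneGroupOut splitter, again with stand-in data, the assumed classifier, and the rank_accuracy helper from the earlier sketches:

```python
# Illustrative leave-one-subject-out evaluation (not the authors' exact code).
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
n_subjects, scans_per_subject, n_voxels = 10, 45, 500    # assumed sizes
X = rng.normal(size=(n_subjects * scans_per_subject, n_voxels))
y = rng.integers(0, 9, size=len(X))                      # 9 emotion classes
groups = np.repeat(np.arange(n_subjects), scans_per_subject)

scores = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
    clf = GaussianNB().fit(X[train_idx], y[train_idx])
    probs = clf.predict_proba(X[test_idx])
    scores.append(rank_accuracy(probs, y[test_idx]))     # helper from sketch above

print(f"mean cross-subject rank accuracy: {np.mean(scores):.2f}")
```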

"Despite manifest differences between people's psychology, different people tend to neurally encode emotions in remarkably similar ways," said Amanda Markey, a graduate student in the Department of Social and Decision Sciences in a press release.

The study is published in the journal PLOS ONE.