The field of neuroscience has been animated recently by the use of Functional Magnetic Resonance Imaging, or fMRI. When a person lies in an fMRI machine, scientists can see their brain activity in real time. It’s a species of mind reading that promises to unlock the still mysterious workings of our grey matter.
In April, a team in Japan announced that it could identify when a subject was dreaming about particular kinds of objects, such as a house, a clock, or a husband. Last November, another group of researchers using the same technique was able to predict whether gadget columnist David Pogue was thinking about a skyscraper or a strawberry.
What earlier studies couldn’t determine, however, was how the subjects were actually feeling. A new study released today by Carnegie Mellon University, which also draws on fMRI, represents the first time researchers have been able to map people’s emotional state based on their neural activity.
"Emotion is a critical part of our lives, but scientifically speaking, it’s been very hard to pin down," said Karim Kassam, an assistant professor of social and decision sciences and the lead author of the study. The gold standard for understanding how people feel has been, quite simply, to ask them. "But if someone is embarrassed by a sexually exciting stimulus, or knows their views on racial matters are outside the norm, then this kind of self-reporting breaks down."
Led by researchers in CMU’s Dietrich College of Humanities and Social Sciences, the study had a group of actors look at words such as anger, disgust, envy, fear, happiness, lust, pride, sadness and shame. As they did so, the actors tried to bring themselves into each emotional state in turn. Their brains were monitored by fMRI, and a computer modeled the results.
Based on these scans, the computer model could then correctly guess the emotion of the actors when they were shown a series of evocative photos. Each emotion essentially had a neural signature. Nor were the patterns of brain activity the computer learned limited to those individuals: based on the scans of the actors' brains, the model could correctly identify the emotions of a new test subject who had not participated in the earlier trials.
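To make the train-on-one-group, test-on-another idea concrete, here is a minimal sketch of that kind of pattern classification. It is not the study's actual method: the emotions, the simulated "activation vectors", the signal shapes, and the nearest-centroid classifier are all illustrative stand-ins for the far more sophisticated fMRI analysis the researchers used.

```python
import random

random.seed(0)

# Hypothetical setup: three emotions, each producing a distinct (noisy)
# pattern of activity across 20 simulated "voxels".
EMOTIONS = ["anger", "happiness", "sadness"]
VOXELS = 20

def simulate_scan(emotion):
    """Return a noisy activation vector whose pattern depends on the emotion."""
    idx = EMOTIONS.index(emotion)
    return [(2.0 if i % len(EMOTIONS) == idx else 0.0) + random.gauss(0, 0.5)
            for i in range(VOXELS)]

def centroid(vectors):
    """Average a list of vectors component-wise: the 'neural signature'."""
    return [sum(vals) / len(vals) for vals in zip(*vectors)]

def classify(scan, centroids):
    """Label a scan with the emotion whose signature is nearest (squared distance)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda emo: dist(scan, centroids[emo]))

# "Train": learn one signature per emotion from 30 scans each...
centroids = {emo: centroid([simulate_scan(emo) for _ in range(30)])
             for emo in EMOTIONS}

# ...then test on 30 fresh scans the model has never seen.
correct = sum(classify(simulate_scan(emo), centroids) == emo
              for emo in EMOTIONS for _ in range(10))
print(correct, "of 30 test scans classified correctly")
```

Because each simulated emotion has its own activation pattern, the learned signatures separate cleanly and the classifier labels nearly all of the held-out scans correctly; the real study's cross-subject result rests on the analogous (but much harder) claim that these signatures are consistent across different people's brains.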
It is not reassuring that science is, as usual, heedlessly pushing into an area with so many risks for civil liberties and individual rights. The possible applications of this technology are horrific, yet nobody ever seems to consider the possibility of science being used against humanity.
Genetic science is a case in point. Genetic information can be used to classify people in negative ways. In psychiatry, even blushing is considered a “mental disorder” by somebody prepared to formally classify it. What are the possibilities for a mind reading machine?
Will someone be required to submit to an emotional test to get a job, or to be analysed for a court case? Could a scan taken while meeting a prospective boss become grounds for rejecting an applicant, if "disgust" is recognised as the emotion being experienced? (You have to wonder how many job applicants could feel emotionally positive in that environment.)
Identifying Emotions on the Basis of Neural Activation