Google Glass app will map your face to detect your emotions
2013-09-05

By Elizabeth Leafloor | Red Ice Creations

A facial recognition app built for the Google Glass platform maps your face and detects your emotions from your expressions.

One proposed use of the technology is to help autistic people, who are often unable to recognize facial expressions of emotion, identify and memorize the emotions being displayed.


The new app is described by DVice:
In recent years, a number of advances in technology have served to help those with autism better understand the rest of the world in terms of emotional connections. Just last year an Ohio mother with an autistic son launched a smartphone app designed to help those with autism train themselves to recognize certain emotions. Another developer has now come up with a similar way to assist the autistic community through the interactive lens of Google Glass.

Announced this week by Catalin Voss and Jonathan Yan, Sension is a company formed to distribute face-tracking apps that map a human face and work to detect the emotion being displayed on that face. While the company is exploring a number of uses for the technology, including serving as an educational aid harnessed through the wearable device framework of Glass, the technology could be particularly useful for autistic users.



Affective computing is "the study and development of systems and devices that can recognize, interpret, process, and simulate human affects" such as emotion. Once computers recognize the range of human emotion, they might be better able to simulate empathy - for better or for worse.
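To make the idea a little more concrete, here is a minimal sketch of how expression-based affect recognition can work: a face tracker supplies landmark coordinates, a few geometric features are computed from them (mouth openness, mouth width, eyebrow raise), and the nearest per-emotion template wins. Everything in it — the landmark set, the features, and the emotion templates — is invented for illustration; it is not Sension's actual method.

```python
# Minimal sketch of landmark-based emotion classification (illustrative only;
# not Sension's actual algorithm). Assumes a separate face tracker has already
# produced (x, y) coordinates for a few named facial landmarks.
import math

# Hypothetical landmark set, normalized to the face bounding box (0..1).
EXAMPLE_LANDMARKS = {
    "left_brow":    (0.30, 0.35),
    "right_brow":   (0.70, 0.35),
    "left_eye":     (0.32, 0.42),
    "right_eye":    (0.68, 0.42),
    "mouth_left":   (0.35, 0.75),
    "mouth_right":  (0.65, 0.75),
    "mouth_top":    (0.50, 0.72),
    "mouth_bottom": (0.50, 0.80),
}

def features(lm):
    """Turn raw landmarks into a few simple geometric features."""
    mouth_open  = lm["mouth_bottom"][1] - lm["mouth_top"][1]   # vertical mouth gap
    mouth_width = lm["mouth_right"][0] - lm["mouth_left"][0]   # smile width
    brow_raise  = lm["left_eye"][1] - lm["left_brow"][1]       # brow-to-eye distance
    return (mouth_open, mouth_width, brow_raise)

# Hand-picked feature templates for each emotion (made up for illustration).
TEMPLATES = {
    "neutral":   (0.05, 0.28, 0.07),
    "happy":     (0.06, 0.38, 0.07),
    "surprised": (0.15, 0.25, 0.12),
}

def classify(lm):
    """Nearest-centroid classification over the feature templates."""
    f = features(lm)
    def dist(template):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(f, template)))
    return min(TEMPLATES, key=lambda emotion: dist(TEMPLATES[emotion]))

if __name__ == "__main__":
    print(classify(EXAMPLE_LANDMARKS))   # -> "neutral" for the sample points
```

Real systems replace the hand-picked templates with models trained on thousands of labelled faces, but the basic shape of the pipeline — track, measure, compare — is the same.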

[Read: I hear dead people! "Voice-cloning tech gives new life to silenced greats"]

Until now, such emotion- and behaviour-tracking tech has mostly existed within the realm of military and government departments like Homeland Security. As reported in 2009 by AlterNet, "Homeland Security Embarks on Big Brother Programs to Read Our Minds and Emotions":
This past February, the Department of Homeland Security (DHS) awarded a one-year, $2.6 million grant to the Cambridge, MA.-based Charles Stark Draper Laboratory to develop computerized sensors capable of detecting a person’s level of "malintent" -- or intention to do harm. It’s only the most recent of numerous contracts awarded to Draper and assorted research outfits by the U.S. government over the past few years under the auspices of a project called "Future Attribute Screening Technologies," or FAST. It’s the next wave of behavior surveillance from DHS and taxpayers have paid some $20 million on it so far.

Conceived as a cutting-edge counter-terrorism tool, the FAST program will ostensibly detect subjects’ bad intentions by monitoring their physiological characteristics, particularly those associated with fear and anxiety. It’s part of a broader "initiative to develop innovative, non-invasive technologies to screen people at security checkpoints," according to DHS.

The "non-invasive" claim might be a bit of a stretch. A DHS report issued last December outlined some of the possible technological features of FAST, which include "a remote cardiovascular and respiratory sensor" to measure "heart rate, heart rate variability, respiration rate, and respiratory sinus arrhythmia," a "remote eye tracker" that "uses a camera and processing software to track the position and gaze of the eyes (and, in some instances, the entire head)," "thermal cameras that provide detailed information on the changes in the thermal properties of the skin in the face," and "a high resolution video that allows for highly detailed images of the face and body … and an audio system for analyzing human voice for pitch change."



Private companies have also been developing software to detect emotion, with the goal of maximizing efficiency, reducing costs, and enhancing ‘customer service’:
Less than two minutes into a cell phone conversation, a new computer program can predict a broken heart -- literally and figuratively.

An Israeli company called eXaudios has developed a computer program, known as Magnify, that decodes the human voice to identify a person’s emotional state.

Some companies in the United States already use the system in their call centers. eXaudios is even testing the software’s use in diagnosing medical conditions like autism, schizophrenia, heart disease and even prostate cancer.

"When agents talk with customers over the phone, they usually focus on content and not intonation, unless the customer is screaming," said Yoram Levanon, President and CEO of eXaudios, which recently won a $1 million prize at the Demo 2010 conference. "If a customer is screaming, you don’t need the software. But if we can identify the other emotions of a customer, we can save customers and companies money."

A number of companies sell software that analyzes conversations between a customer service agent and a customer after the conversation is over. Magnify monitors a phone call in real time. The program then lists the caller’s emotions on screen.

[...]

Magnify is not 100 percent accurate, however. Between 17 percent and 24 percent of the time Magnify fails to identify a caller’s correct emotions.
Source
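eXaudios has not published how Magnify decodes a voice, but intonation analysis of the kind described above typically starts from low-level signal measurements such as the pitch of short speech frames. The sketch below estimates that pitch by autocorrelation on a synthetic tone standing in for recorded speech; it illustrates the underlying measurement, not Magnify's algorithm.

```python
# Sketch of a basic intonation feature: estimating the pitch (fundamental
# frequency) of a short voice frame with autocorrelation. eXaudios has not
# published Magnify's internals; this only illustrates the kind of signal
# measurement that voice-intonation analysis commonly builds on.
import math

SAMPLE_RATE = 8000            # samples per second
FRAME = [math.sin(2 * math.pi * 180 * n / SAMPLE_RATE) for n in range(480)]
# ^ synthetic 180 Hz tone standing in for a 60 ms frame of recorded speech

def estimate_pitch(frame, sample_rate, fmin=75.0, fmax=300.0):
    """Return the autocorrelation peak in the 75-300 Hz range, converted to Hz."""
    min_lag = int(sample_rate / fmax)           # shortest period considered
    max_lag = int(sample_rate / fmin)           # longest period considered
    best_lag, best_corr = min_lag, float("-inf")
    for lag in range(min_lag, max_lag + 1):
        corr = sum(frame[i] * frame[i - lag] for i in range(lag, len(frame)))
        if corr > best_corr:
            best_corr, best_lag = corr, lag
    return sample_rate / best_lag

if __name__ == "__main__":
    print(f"estimated pitch: {estimate_pitch(FRAME, SAMPLE_RATE):.1f} Hz")
    # prints roughly 180 Hz; tracking how this value moves over the course of a
    # call is one ingredient of intonation-based analysis
```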

Errors are unavoidable in this imperfect tech, and it’s not a stretch to think that apps such as Sension or the Department of Homeland Security’s FAST software could make incorrect judgements of emotion, behaviour, and intent.


As these technologies inevitably advance and see wider use, the public must stay vigilant and involved to ensure they don’t become misused tools of control under the cover of educational/disability aids or ‘enhanced’ security.

By Elizabeth Leafloor, Red Ice Creations




Related Articles
Mood-tracking app paves way for pocket therapy
Google Glass Will Track Your Gaze, Patent Hints
International officials demand privacy answers on Google Glass
Glasses Fool Facial Recognition Software
Google Glass: Let the evil commence
A look at the darker side of Google Glass
App developer hopes to use Google Glass to help the blind see
Stop the Cyborgs: The anti-Google Glass movement
Computer correctly identifies emotions for the first time
Expressing negative emotions could extend lifespan
New TVs will watch you and record your emotions
Should Socio-Emotional Learning Be Taught in Schools?

