Google Glass app will map your face to detect your emotions
2013-09-05 0:00

By Elizabeth Leafloor | Red Ice Creations

A facial recognition app created for the Google Glass platform maps your face and detects your emotions based on your expressions.

One of the proposed uses of the tech is to assist autistic people, who are often unable to recognize facial expressions of emotion, by helping them identify and memorize the emotions being displayed.

The new app is described by DVice:
In recent years, a number of advances in technology have served to help those with autism better understand the rest of the world in terms of emotional connections. Just last year an Ohio mother with an autistic son launched a smartphone app designed to help those with autism train themselves to recognize certain emotions. Another developer has now come up with a similar way to assist the autistic community through the interactive lens of Google Glass.

Announced this week by Catalin Voss and Jonathan Yan, Sension is a company formed to distribute face-tracking apps that map a human face and work to detect the emotion being displayed on that face. While the company is exploring a number of uses for the technology, including serving as an educational aid harnessed through the wearable device framework of Glass, the technology could be particularly useful for autistic users.
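
For readers curious what the face-mapping half of such an app involves, here is a minimal, generic sketch in Python (our illustration, not Sension's actual software). It uses the open-source OpenCV library to find a face in a camera frame and hands the cropped face to a placeholder classifier; the classifier, the emotion labels, and the webcam standing in for the Glass camera are all illustrative assumptions.

    # A rough sketch of the face-mapping step (our illustration, not Sension's code).
    # OpenCV's bundled Haar cascade finds a face in a camera frame; the crop is then
    # handed to a placeholder classifier standing in for a trained emotion model.
    import cv2

    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    EMOTIONS = ["neutral", "happy", "sad", "angry", "surprised"]

    def classify_emotion(face_crop):
        # Placeholder: a real app would run a trained model on the face crop
        # or on facial landmarks extracted from it.
        return EMOTIONS[0]

    def label_faces(frame):
        """Find faces in a video frame and attach an emotion label to each."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        return [((x, y, w, h), classify_emotion(gray[y:y + h, x:x + w]))
                for (x, y, w, h) in faces]

    if __name__ == "__main__":
        capture = cv2.VideoCapture(0)      # a webcam stands in for the Glass camera
        ok, frame = capture.read()
        if ok:
            for box, emotion in label_faces(frame):
                print(box, emotion)
        capture.release()

The interesting work lives in whatever trained model replaces the placeholder; everything above it is routine computer-vision plumbing.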

Affective computing is "the study and development of systems and devices that can recognize, interpret, process, and simulate human affects" such as emotion. Once computers recognize the range of human emotion, they might be better able to simulate empathy - for better or for worse.

Until now, such emotion- and behaviour-tracking tech has mostly existed within the realm of military and government departments like Homeland Security. As reported in 2009 by AlterNet, in "Homeland Security Embarks on Big Brother Programs to Read Our Minds and Emotions":
This past February, the Department of Homeland Security (DHS) awarded a one-year, $2.6 million grant to the Cambridge, MA.-based Charles Stark Draper Laboratory to develop computerized sensors capable of detecting a person’s level of "malintent" -- or intention to do harm. It’s only the most recent of numerous contracts awarded to Draper and assorted research outfits by the U.S. government over the past few years under the auspices of a project called "Future Attribute Screening Technologies," or FAST. It’s the next wave of behavior surveillance from DHS and taxpayers have paid some $20 million on it so far.

Conceived as a cutting-edge counter-terrorism tool, the FAST program will ostensibly detect subjects’ bad intentions by monitoring their physiological characteristics, particularly those associated with fear and anxiety. It’s part of a broader "initiative to develop innovative, non-invasive technologies to screen people at security checkpoints," according to DHS.

The "non-invasive" claim might be a bit of a stretch. A DHS report issued last December outlined some of the possible technological features of FAST, which include "a remote cardiovascular and respiratory sensor" to measure "heart rate, heart rate variability, respiration rate, and respiratory sinus arrhythmia," a "remote eye tracker" that "uses a camera and processing software to track the position and gaze of the eyes (and, in some instances, the entire head)," "thermal cameras that provide detailed information on the changes in the thermal properties of the skin in the face," and "a high resolution video that allows for highly detailed images of the face and body … and an audio system for analyzing human voice for pitch change."

Private companies have also been developing software to detect emotion, with the goal of maximizing efficiency, reducing cost, and enhancing ‘customer service’:
Less than two minutes into a cell phone conversation, a new computer program can predict a broken heart -- literally and figuratively.

An Israeli company called eXaudios has developed a computer program, known as Magnify, that decodes the human voice to identify a person’s emotional state.

Some companies in the United States already use the system in their call centers. eXaudios is even testing the software’s use in diagnosing medical conditions like autism, schizophrenia, heart disease and even prostate cancer.

"When agents talk with customers over the phone, they usually focus on content and not intonation, unless the customer is screaming," said Yoram Levanon, President and CEO of eXaudios, which recently won a $1 million prize at the Demo 2010 conference. "If a customer is screaming, you don’t need the software. But if we can identify the other emotions of a customer, we can save customers and companies money."

A number of companies sell software that analyzes conversations between a customer service agent and a customer after the conversation is over. Magnify monitors a phone call in real time. The program then lists the caller’s emotions on screen.

Magnify is not 100 percent accurate, however. Between 17 percent and 24 percent of the time Magnify fails to identify a caller’s correct emotions.
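
To make the intonation idea concrete, here is a minimal, generic sketch in Python (our illustration, not eXaudios’ Magnify). It estimates the pitch of short windows of a call’s audio and flags calls whose pitch swings widely; the half-second windows, the 40 Hz spread cutoff, and the two labels are illustrative assumptions.

    # A rough sketch of real-time intonation analysis (our illustration, not Magnify).
    # Pitch is estimated over short windows of telephone audio; a wide spread of
    # pitch across the call is treated as a sign of agitation.
    import numpy as np

    SAMPLE_RATE = 8000           # typical telephone audio
    WINDOW = SAMPLE_RATE // 2    # half-second analysis windows (an assumption)

    def estimate_pitch(window, sample_rate=SAMPLE_RATE):
        """Crude autocorrelation pitch estimate in Hz."""
        window = window - window.mean()
        corr = np.correlate(window, window, mode="full")[len(window) - 1:]  # corr[k] = lag k
        min_lag = sample_rate // 500    # ignore lags above ~500 Hz, including the lag-0 peak
        lag = min_lag + int(np.argmax(corr[min_lag:]))
        return sample_rate / lag

    def label_call(signal):
        """Label a call from the spread of its per-window pitch estimates."""
        pitches = [estimate_pitch(signal[i:i + WINDOW])
                   for i in range(0, len(signal) - WINDOW, WINDOW)]
        spread = np.std(pitches)
        return "agitated" if spread > 40.0 else "calm"   # 40 Hz cutoff is arbitrary

    if __name__ == "__main__":
        # Synthetic stand-in for speech: a tone whose pitch wanders over five seconds.
        t = np.arange(SAMPLE_RATE * 5) / SAMPLE_RATE
        test_signal = np.sin(2 * np.pi * (180 + 60 * np.sin(t)) * t)
        print(label_call(test_signal))

Even this toy version shows why such systems are probabilistic at best: pitch is only one of many vocal cues, and the cutoffs are guesses.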

Errors are unavoidable in this imperfect tech, and it’s not a stretch to think that apps like Sension’s, or the Department of Homeland Security’s FAST software, could make incorrect judgements of emotion, behaviour, and intent.

As these technologies inevitably advance and see wider use, the public must stay vigilant and involved to ensure they don’t become misused tools of control under the cover of educational and disability aids, or ‘enhanced’ security.

By Elizabeth Leafloor, Red Ice Creations

Related Articles
Mood-tracking app paves way for pocket therapy
Google Glass Will Track Your Gaze, Patent Hints
International officials demand privacy answers on Google Glass
Glasses Fool Facial Recognition Software
Google Glass: Let the evil commence
A look at the darker side of Google Glass
App developer hopes to use Google Glass to help the blind see
Stop the Cyborgs: The anti-Google Glass movement
Computer correctly identifies emotions for the first time
Expressing negative emotions could extend lifespan
New TVs will watch you and record your emotions
Should Socio-Emotional Learning Be Taught in Schools?
