Google Glass app will map your face to detect your emotions
2013-09-05 0:00

By Elizabeth Leafloor | Red Ice Creations

A facial recognition app has been created for the Google Glass platform that maps your face and detects your emotions based on your expressions.

One proposed use of the tech is to assist autistic people, who are often unable to recognize facial expressions of emotion, in identifying and memorizing the emotions being displayed.


The new app is described by DVice:
In recent years, a number of advances in technology have served to help those with autism better understand the rest of the world in terms of emotional connections. Just last year an Ohio mother with an autistic son launched a smartphone app designed to help those with autism train themselves to recognize certain emotions. Another developer has now come up with a similar way to assist the autistic community through the interactive lens of Google Glass.

Announced this week by Catalin Voss and Jonathan Yan, Sension is a company formed to distribute face-tracking apps that map a human face and work to detect the emotion being displayed on that face. While the company is exploring a number of uses for the technology, including serving as an educational aid harnessed through the wearable device framework of Glass, the technology could be particularly useful for autistic users.
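To picture how such an app works under the hood, consider the basic pipeline: find a face in the camera frame, crop it, and hand it to an expression classifier. The Python sketch below is purely illustrative and is not Sension's code; the classify_expression stub stands in for whatever trained model a real product would use.

# A minimal sketch of the face-mapping idea described above: detect a
# face, crop the region of interest, and pass it to an expression
# classifier. NOT Sension's implementation; classify_expression is a
# stub, and the Haar cascade is a standard OpenCV asset.
import cv2

EMOTIONS = ["happy", "sad", "angry", "surprised", "neutral"]

def classify_expression(face_img) -> str:
    """Stub for an expression classifier; a real system would run a
    trained model over facial landmarks or pixels."""
    return "neutral"  # placeholder prediction

def detect_emotions(frame):
    """Return a list of (bounding box, predicted emotion) per face."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    results = []
    for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1,
                                                 minNeighbors=5):
        face = gray[y:y + h, x:x + w]
        results.append(((x, y, w, h), classify_expression(face)))
    return results

Everything interesting happens inside the classifier stub, which is exactly where real systems succeed or fail.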



Affective computing is "the study and development of systems and devices that can recognize, interpret, process, and simulate human affects" such as emotion. Once computers recognize the range of human emotion, they might be better able to simulate empathy - for better or for worse.

[Read: I hear dead people! "Voice-cloning tech gives new life to silenced greats"]

Until now, such emotion- and behaviour-tracking tech has mostly existed within the realm of military and government departments like Homeland Security. As AlterNet reported in 2009 in "Homeland Security Embarks on Big Brother Programs to Read Our Minds and Emotions":
This past February, the Department of Homeland Security (DHS) awarded a one-year, $2.6 million grant to the Cambridge, MA-based Charles Stark Draper Laboratory to develop computerized sensors capable of detecting a person's level of "malintent" -- or intention to do harm. It's only the most recent of numerous contracts awarded to Draper and assorted research outfits by the U.S. government over the past few years under the auspices of a project called "Future Attribute Screening Technologies," or FAST. It's the next wave of behavior surveillance from DHS, and taxpayers have paid some $20 million on it so far.

Conceived as a cutting-edge counter-terrorism tool, the FAST program will ostensibly detect subjects' bad intentions by monitoring their physiological characteristics, particularly those associated with fear and anxiety. It's part of a broader "initiative to develop innovative, non-invasive technologies to screen people at security checkpoints," according to DHS.

The "non-invasive" claim might be a bit of a stretch. A DHS report issued last December outlined some of the possible technological features of FAST, which include "a remote cardiovascular and respiratory sensor" to measure "heart rate, heart rate variability, respiration rate, and respiratory sinus arrhythmia," a "remote eye tracker" that "uses a camera and processing software to track the position and gaze of the eyes (and, in some instances, the entire head)," "thermal cameras that provide detailed information on the changes in the thermal properties of the skin in the face," and "a high resolution video that allows for highly detailed images of the face and body Ö and an audio system for analyzing human voice for pitch change."



Private companies have also been developing software to detect emotion, with the goal of maximizing efficiency, reducing cost, and enhancing 'customer service'. One such effort was described in a 2010 report:
Less than two minutes into a cell phone conversation, a new computer program can predict a broken heart -- literally and figuratively.

An Israeli company called eXaudios has developed a computer program, known as Magnify, that decodes the human voice to identify a person's emotional state.

Some companies in the United States already use the system in their call centers. eXaudios is even testing the software's use in diagnosing medical conditions like autism, schizophrenia, heart disease and even prostate cancer.

"When agents talk with customers over the phone, they usually focus on content and not intonation, unless the customer is screaming," said Yoram Levanon, President and CEO of eXaudios, which recently won a $1 million prize at the Demo 2010 conference. "If a customer is screaming, you donít need the software. But if we can identify the other emotions of a customer, we can save customers and companies money."

A number of companies sell software that analyzes conversations between a customer service agent and a customer after the conversation is over. Magnify monitors a phone call in real time. The program then lists the caller's emotions on screen.

[...]

Magnify is not 100 percent accurate, however. Between 17 percent and 24 percent of the time, Magnify fails to identify a caller's correct emotions.
Source
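Intonation analysis of the kind Magnify performs starts from something measurable: the pitch of the voice and how much it moves around. The sketch below shows one textbook way to estimate pitch (autocorrelation over short audio frames) and a deliberately crude mapping from pitch statistics to a label; the thresholds and labels are invented, and eXaudios' real method is certainly far more sophisticated.

# A toy illustration of intonation-based call analysis: estimate pitch
# per audio frame via autocorrelation, then map crude pitch statistics
# to a label. Thresholds are invented for this sketch.
import numpy as np

def estimate_pitch(frame: np.ndarray, sr: int = 8000) -> float:
    """Estimate the fundamental frequency of one audio frame (Hz)."""
    frame = frame - frame.mean()
    # Autocorrelation; keep non-negative lags only.
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    # Search lags corresponding to 60-400 Hz, a typical speech range.
    lo, hi = sr // 400, sr // 60
    lag = lo + int(np.argmax(corr[lo:hi]))
    return sr / lag

def label_call(frames, sr: int = 8000) -> str:
    """Crude emotion label from pitch mean and spread over a call."""
    pitches = np.array([estimate_pitch(f, sr) for f in frames])
    mean, spread = pitches.mean(), pitches.std()
    if spread > 40:                    # invented threshold: pitch swings
        return "agitated"
    return "elevated" if mean > 200 else "calm"

Even this toy version shows why error rates like the 17-24 percent quoted above are unsurprising: pitch statistics overlap heavily between, say, excitement and anger, so any mapping from intonation to emotion is probabilistic at best.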

Errors are unavoidable in this imperfect tech, and it's not a stretch to think that apps such as Sension's or the Department of Homeland Security's FAST software could make incorrect judgements of emotion, behaviour, and intent.


As these technologies inevitably advance and see wider use, the public must stay vigilant and involved in order to ensure they don't become misused tools of control under the cover of educational/disability aids or 'enhanced' security.

By Elizabeth Leafloor, Red Ice Creations




Related Articles
Mood-tracking app paves way for pocket therapy
Google Glass Will Track Your Gaze, Patent Hints
International officials demand privacy answers on Google Glass
Glasses Fool Facial Recognition Software
Google Glass: Let the evil commence
A look at the darker side of Google Glass
App developer hopes to use Google Glass to help the blind see
Stop the Cyborgs: The anti-Google Glass movement
Computer correctly identifies emotions for the first time
Expressing negative emotions could extend lifespan
New TVs will watch you and record your emotions
Should Socio-Emotional Learning Be Taught in Schools?

