The robot that reads your mind to train itself
By Lakshmi Sandhana | BBC News
Rajesh Rao is a man who believes that the best type of robotic helper is one that can read your mind.
In fact, he’s more than just an advocate of mind-controlled robots; he believes in training them through the power of thought alone.
His team at the Neural Systems Laboratory, University of Washington, hopes to take brain-computer interface (BCI) technology to the next level by attempting to teach robots new skills directly via brain signals.
Robotic surrogates that offer paralysed people the freedom to explore their environment, manipulate objects or simply fetch things have long been the holy grail of BCI research.
Dr Rao’s team began by programming a humanoid robot with simple behaviours which users could then select with a wearable electroencephalogram (EEG) cap that picked up their brain activity.
The brain involuntarily generates what is known as a P300, or P3, signal each time it recognises something it has been looking for. The signal arises from millions of neurons firing together in a synchronised fashion.
This has been used by many researchers worldwide to create BCI-based applications that allow users to spell a word, identify images, select buttons in a virtual environment and more recently, even play in an orchestra or send a Twitter message.
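The classic P300 selection scheme flashes each menu option repeatedly and averages the EEG epochs time-locked to each option's flashes; the option the user is attending to tends to show the largest positive deflection roughly 300 ms after the flash. The sketch below illustrates that averaging-and-scoring idea on simulated data; the option names, window and signal sizes are illustrative, not taken from Dr Rao's system.

```python
import numpy as np

RNG = np.random.default_rng(0)

def p300_amplitude(epochs, window=(25, 40)):
    """Mean amplitude in a post-stimulus window (in samples), after
    averaging all epochs for one option. Larger = stronger P300-like response."""
    avg = epochs.mean(axis=0)              # average over repeated flashes
    return avg[window[0]:window[1]].mean()

def select_option(epochs_by_option):
    """Pick the option whose averaged epochs show the largest response."""
    scores = {name: p300_amplitude(ep) for name, ep in epochs_by_option.items()}
    return max(scores, key=scores.get)

# Simulated EEG: 20 epochs x 50 samples per option; the attended option
# carries an added positive bump around sample 30 (~300 ms post-flash).
def simulate(attended=False):
    eeg = RNG.normal(0.0, 1.0, size=(20, 50))
    if attended:
        eeg[:, 25:40] += 2.0               # injected P300-like deflection
    return eeg

options = {"walk": simulate(), "fetch": simulate(attended=True), "pick up": simulate()}
print(select_option(options))              # -> "fetch"
```

Averaging is what makes this work: single-trial EEG is far too noisy, but the noise cancels across repetitions while the stimulus-locked deflection survives.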
The team’s initial goal was for the user to send a command which the robot would then translate into a movement.
However, this requires programming the robot with a predefined set of very basic behaviours, an approach which Dr Rao ultimately found to be very limiting.
The team reasoned that giving the robot the ability to learn might just be the trick to allow a greater range of movements and responses.
"What if the user wants the robot to do something new?" Dr Rao asked.
The answer, he said, was to tap into the brain’s "hierarchical" system used to control the body.
"The brain is organised into multiple levels of control including the spinal cord at the low level to the neocortex at the high level," he said.
"The low level circuits take care of behaviours such as walking while the higher level allows you to perform other behaviours.
"For example, a behaviour such as driving a car is first learned but later becomes an almost autonomous lower level behaviour, freeing you to recognize and wave to a friend on the street while driving."
To emulate this kind of behaviour - albeit in a more simplistic fashion - Dr Rao and his team are developing a hierarchical brain-computer interface for controlling the robot.
"A behaviour initially taught by the user is translated into a higher-level command. When invoked later, the details of the behaviour are handled by the robot," he said.
A number of groups worldwide are attempting to create thought-controlled robots for various applications.
Early last year Honda demonstrated how their robot Asimo could lift an arm or a leg through signals sent wirelessly from a system operated by a user with an EEG cap.
Scientists at the University of Zaragoza in Spain are working on creating robotic wheelchairs that can be manipulated by thought.
Designing a truly adaptive brain-robot interface that allows paralysed patients to teach a robot directly could be immensely helpful, liberating them from mice, keyboards and touchscreens designed for able-bodied users.
Using BCIs can also be a time-consuming and clumsy process, since it takes a while for the system to accurately identify the brain signals.
"It does make good sense to teach the robot a growing set of higher-level tasks and then be able to call upon them without having to describe them in detail every time - especially because the interfaces I have seen using... brain input are generally slower and more awkward than the mouse or keyboard interfaces that users without disabilities typically use," says Robert Jacob, professor of computer science at Tufts University.
Rao’s latest robot prototype is "Mitra" - meaning "friend". It’s a two-foot-tall humanoid that can walk, look for familiar objects and pick up or drop off objects. The team is building a BCI that can be used to train Mitra to walk to different locations within a room.
Once a person puts on the EEG cap they can choose to either teach the robot a new skill or execute a known command through a menu.
In the "teaching" mode, machine learning algorithms are used to map the sensor readings the robot gets to appropriate commands.
If the robot is successful in learning the new behaviour then the user can ask the system to store it as a new high-level command that will appear on the list of available choices the next time.
"The resulting system is both adaptive and hierarchical - adaptive because it learns from the user and hierarchical because new commands can be composed as sequences of previously learned commands," Dr Rao says.
The major challenge at the moment is getting the system to be accurate given how noisy EEG signals can be.
"While EEG can be used to teach the robot simple skills such as navigating to a new location, we do not expect to be able to teach the robot complex skills that involve fine manipulation, such as opening a medicine bottle or tying shoelaces" says Rao.
It may be possible to attain a finer degree of control either by utilising an invasive BCI or by allowing the user to select from videos of useful human actions that the robot could attempt to learn.
A parallel effort in the same laboratory is working on imitation-based learning algorithms that would allow a robot to imitate complex actions such as kicking a ball or lifting objects by watching a human do the task.
Dr Rao believes that there are very interesting times ahead as researchers explore whether the human brain can truly break out of the evolutionary confines of the human body to directly exert control over non-biological robotic devices.
"In some ways, our brains have already overcome some of the limitations of the human body by employing cars and airplanes to travel faster than by foot, cell phones to communicate further than by immediate speech, books and the internet to store more information than can fit in one brain," says Rao.
"Being able to exert direct control on the physical environment rather than through the hands and legs might represent the next step in this progression, if the ethical issues involved are adequately addressed."
Article from: bbc.co.uk