The robot that reads your mind to train itself
25 October 2010
By Lakshmi Sandhana | BBC News
Rajesh Rao is a man who believes that the best type of robotic helper is one who can read your mind.
In fact, he’s more than just an advocate of mind-controlled robots; he believes in training them through the power of thought alone.
His team at the Neural Systems Laboratory, University of Washington, hopes to take brain-computer interface (BCI) technology to the next level by attempting to teach robots new skills directly via brain signals.
Robotic surrogates that offer paralyzed people the freedom to explore their environment, manipulate objects or simply fetch things have long been the holy grail of BCI research.
Dr Rao’s team began by programming a humanoid robot with simple behaviours which users could then select with a wearable electroencephalogram (EEG) cap that picked up their brain activity.
Each time the brain recognizes an object, it involuntarily generates what is known as a P300, or P3, signal. This signal is caused by millions of neurons firing together in a synchronised fashion.
This has been used by many researchers worldwide to create BCI-based applications that allow users to spell words, identify images, select buttons in a virtual environment and, more recently, even play in an orchestra or send a Twitter message.
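In a typical P300 application, the system flashes candidate options, averages the EEG epochs recorded after each one, and picks the option whose averaged response shows the characteristic positive bump around 300 milliseconds. The sketch below illustrates that averaging-and-scoring idea on simulated data; it is a minimal toy, not the Washington team's actual pipeline, and all names and parameters are illustrative.

```python
import numpy as np

def detect_p300_target(epochs, fs=250, window=(0.25, 0.45)):
    """Pick which stimulus the user attended to.

    epochs: dict mapping stimulus id -> array of shape
            (n_trials, n_samples), each trial time-locked
            to that stimulus onset.
    Returns the stimulus id whose averaged response has the
    largest mean amplitude in the P300 window (~300 ms).
    """
    start, end = int(window[0] * fs), int(window[1] * fs)
    scores = {}
    for stim, trials in epochs.items():
        erp = trials.mean(axis=0)             # averaging cancels uncorrelated noise
        scores[stim] = erp[start:end].mean()  # amplitude in the P300 window
    return max(scores, key=scores.get)

# Toy demo: stimulus "B" carries a simulated P300 bump, "A" is noise only.
rng = np.random.default_rng(0)
fs, n_samples = 250, 200                       # 0.8 s epochs at 250 Hz
t = np.arange(n_samples) / fs
p300 = 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # bump at 300 ms
epochs = {
    "A": rng.normal(0, 1, (20, n_samples)),
    "B": rng.normal(0, 1, (20, n_samples)) + p300,
}
print(detect_p300_target(epochs, fs))  # 'B'
```

Averaging over trials is the key trick: the P300 deflection adds up coherently across repetitions while background EEG noise averages toward zero, which is why P300 spellers flash each option several times before deciding.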
The team’s initial goal was for the user to send a command that the robot would then translate into a movement.
However, this requires programming the robot with a predefined set of very basic behaviours, an approach which Dr Rao ultimately found to be very limiting.
The team reasoned that giving the robot the ability to learn might just be the trick to allow a greater range of movements and responses.
"What if the user wants the robot to do something new?" Dr Rao asked.
The answer, he said, was to tap into the brain’s "hierarchical" system used to control the body.
"The brain is organised into multiple levels of control including the spinal cord at the low level to the neocortex at the high level," he said.
"The low level circuits take care of behaviours such as walking while the higher level allows you to perform other behaviours.
"For example, a behaviour such as driving a car is first learned but later becomes an almost autonomous lower level behaviour, freeing you to recognize and wave to a friend on the street while driving."
To emulate this kind of behaviour - albeit in a more simplistic fashion - Dr Rao and his team are developing a hierarchical brain-computer interface for controlling the robot.
"A behaviour initially taught by the user is translated into a higher-level command. When invoked later, the details of the behaviour are handled by the robot," he said.
A number of groups worldwide are attempting to create thought-controlled robots for various applications.
Early last year Honda demonstrated how its robot Asimo could lift an arm or a leg in response to signals sent wirelessly from a system operated by a user wearing an EEG cap.
Scientists at the University of Zaragoza in Spain are working on creating robotic wheelchairs that can be manipulated by thought.
Designing a truly adaptive brain-robot interface that allows paralysed patients to directly teach a robot to do something could be immensely helpful, freeing them from input devices such as mice, keyboards and touchscreens that were designed for able-bodied users.
Using BCIs can also be a time-consuming and clumsy process, since it takes a while for the system to accurately identify the brain signals.
"It does make good sense to teach the robot a growing set of higher-level tasks and then be able to call upon them without having to describe them in detail every time - especially because the interfaces I have seen using... brain input are generally slower and more awkward than the mouse or keyboard interfaces that users without disabilities typically use," says Robert Jacob, professor of computer science at Tufts University.
Rao’s latest robot prototype is "Mitra", meaning "friend" - a two-foot-tall humanoid that can walk, look for familiar objects, and pick them up or drop them off. The team is building a BCI that can be used to train Mitra to walk to different locations within a room.
Once a person puts on the EEG cap they can choose to either teach the robot a new skill or execute a known command through a menu.
In the "teaching" mode, machine learning algorithms are used to map the sensor readings the robot gets to appropriate commands.
If the robot is successful in learning the new behaviour then the user can ask the system to store it as a new high-level command that will appear on the list of available choices the next time.
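The article does not say which learning algorithm the team uses, but the teaching mode it describes - collecting examples of sensor readings paired with the right command, then generalising from them - can be illustrated with the simplest possible learner, a nearest-neighbour lookup. The class and sensor values below are hypothetical stand-ins, not the team's system.

```python
import numpy as np

class TeachingMode:
    """Illustrative sketch: learn a mapping from the robot's sensor
    readings to commands from examples given while teaching.
    (A stand-in for the unspecified learning algorithm in the article.)"""

    def __init__(self):
        self.examples = []   # (sensor_vector, command) pairs

    def observe(self, sensors, command):
        """Record one teaching example: 'in this situation, do this'."""
        self.examples.append((np.asarray(sensors, dtype=float), command))

    def predict(self, sensors):
        """Pick the command whose teaching example is closest
        to the current sensor reading (1-nearest-neighbour)."""
        s = np.asarray(sensors, dtype=float)
        dists = [np.linalg.norm(s - x) for x, _ in self.examples]
        return self.examples[int(np.argmin(dists))][1]

# Hypothetical two-element sensor: [obstacle_right, obstacle_left].
tm = TeachingMode()
tm.observe([0.9, 0.1], "turn_left")    # obstacle sensed on the right
tm.observe([0.1, 0.9], "turn_right")   # obstacle sensed on the left
print(tm.predict([0.8, 0.2]))  # 'turn_left'
```

A real BCI trainer would use a far more robust classifier and many noisy examples, but the shape of the problem - examples in, generalised command policy out - is the same.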
"The resulting system is both adaptive and hierarchical - adaptive because it learns from the user and hierarchical because new commands can be composed as sequences of previously learned commands," Dr Rao says.
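The adaptive, hierarchical scheme Rao describes - new commands defined as sequences of previously learned ones - can be sketched in a few lines. Every name here is illustrative; this is a conceptual outline, not the team's implementation.

```python
class HierarchicalBCI:
    """Commands form a hierarchy: high-level commands are stored
    as sequences of previously known (possibly also taught) ones."""

    def __init__(self):
        # Low-level primitives the robot ships with; each acts on
        # the robot (here a plain list standing in for a motion log).
        self.commands = {
            "step_forward": lambda robot: robot.append("step_forward"),
            "turn_left":    lambda robot: robot.append("turn_left"),
        }

    def teach(self, name, sequence):
        """Store a new high-level command as a sequence of known ones."""
        unknown = [c for c in sequence if c not in self.commands]
        if unknown:
            raise ValueError(f"unknown sub-commands: {unknown}")
        self.commands[name] = lambda robot, seq=tuple(sequence): [
            self.commands[c](robot) for c in seq
        ]

    def execute(self, name, robot):
        self.commands[name](robot)

# Teach "go_to_door" once; afterwards it can be invoked - and reused
# inside further commands - just like a built-in primitive.
bci = HierarchicalBCI()
bci.teach("go_to_door", ["step_forward", "step_forward", "turn_left"])
bci.teach("patrol", ["go_to_door", "go_to_door"])   # composed hierarchically
log = []
bci.execute("patrol", log)
print(log)
```

The payoff is exactly the one Rao and Jacob point to: once "go_to_door" exists as a single menu entry, the slow P300 selection process only has to pick it once, instead of spelling out every low-level step each time.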
The major challenge at the moment is getting the system to be accurate given how noisy EEG signals can be.
"While EEG can be used to teach the robot simple skills such as navigating to a new location, we do not expect to be able to teach the robot complex skills that involve fine manipulation, such as opening a medicine bottle or tying shoelaces," says Rao.
It may be possible to attain a finer degree of control either by utilising an invasive BCI or by allowing the user to select from videos of useful human actions that the robot could attempt to learn.
A parallel effort in the same laboratory is working on imitation-based learning algorithms that would allow a robot to imitate complex actions such as kicking a ball or lifting objects by watching a human do the task.
Dr Rao believes that there are very interesting times ahead as researchers explore whether the human brain can truly break out of the evolutionary confines of the human body to directly exert control over non-biological robotic devices.
"In some ways, our brains have already overcome some of the limitations of the human body by employing cars and airplanes to travel faster than by foot, cell phones to communicate further than by immediate speech, books and the internet to store more information than can fit in one brain," says Rao.
"Being able to exert direct control on the physical environment rather than through the hands and legs might represent the next step in this progression, if the ethical issues involved are adequately addressed."
Article from: bbc.co.uk