The Next Frontier In Computing: Your Brain
President Barack Obama is introduced by Dr. Francis Collins, Director, National Institutes of Health, at the BRAIN Initiative event in the East Room of the White House, April 2, 2013. (Official White House Photo by Chuck Kennedy)
At a recent conference, IBM researchers unveiled a new computer architecture in the hope of creating a computer as powerful as the human brain. Using simulations of “enormous complexity,” the researchers have created an architecture, named TrueNorth, that could lead to a new generation of machines that function more like biological brains than traditional computers.
An article on the MIT Technology Review website describes one opportunity for the TrueNorth cognitive computing architecture: “to develop systems as powerful as human vision. The brain sorts through more than one terabyte of visual data each day but requires little power to do so.”
The technique was developed by researcher Dharmendra S. Modha and his team at IBM Research – Almaden in San Jose, California. Dr. Modha is the founder of IBM’s Cognitive Computing group and leads a global team focused on the intersection of neuroscience, nanoscience and supercomputing. His team is attempting to build computing systems that emulate the brain’s abilities for perception, action, and cognition – all while consuming many orders of magnitude less power and space than today’s computers. In 2009, his group received ACM’s Gordon Bell Prize for its research into cortical simulations at scale.
Yet another area of cognitive computing research focuses on the Boltzmann machine, an algorithm invented by Geoffrey Hinton and Terry Sejnowski in 1983. The Boltzmann machine is capable of learning the underlying constraints that characterize a domain simply by being shown examples from that domain. Put more simply, it can “interpret & recognize patterns” in much the same way our brains do, with or without context.
Quanta Magazine further explains the concept indicating that this approach is “particularly promising as a simple theoretical explanation of a number of brain processes, including development, memory formation, object and sound recognition, and the sleep-wake cycle.”
“The magic thing that happens is it’s able to generalize,” said Yann LeCun, director of the Center for Data Science at New York University. “If you show it a car it has never seen before, if it has some common shape or aspect to all the cars you showed it during training, it can determine it’s a car.”
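The pattern-learning idea behind the Boltzmann machine can be sketched in a few lines. This is a minimal toy, using the restricted variant trained with one-step contrastive divergence (a later approximation to the original learning rule); the data, dimensions, and learning rate are invented purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 6-bit binary patterns falling into two groups.
data = np.array([[1, 1, 1, 0, 0, 0],
                 [1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1],
                 [0, 0, 0, 1, 1, 1]], dtype=float)

n_visible, n_hidden = 6, 2
W = rng.normal(0, 0.1, (n_visible, n_hidden))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# One-step contrastive divergence (CD-1): nudge the weights so that
# reconstructions of the data look more like the data itself.
for epoch in range(2000):
    v0 = data
    h0 = sigmoid(v0 @ W)
    v1 = sigmoid(h0 @ W.T)
    h1 = sigmoid(v1 @ W)
    W += 0.1 * (v0.T @ h0 - v1.T @ h1) / len(data)

# A partial pattern the machine has never seen is "completed" toward
# the group it resembles - the generalization LeCun describes.
probe = np.array([[1, 1, 0, 0, 0, 0]], dtype=float)
recon = sigmoid(sigmoid(probe @ W) @ W.T)
print(recon.round(2))
```

After training, the reconstruction of the incomplete probe assigns higher probability to the first three bits than the last three, even though that exact pattern never appeared in the training set.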
Researchers aren’t alone in attempting to unlock some of the mysteries of the brain. Earlier this year, President Obama unveiled the “BRAIN” Initiative, short for Brain Research through Advancing Innovative Neurotechnologies. It is a bold new research effort to revolutionize our understanding of the human mind. The initiative launched with approximately $100 million in funding for research supported by the National Institutes of Health (NIH), the Defense Advanced Research Projects Agency (DARPA), and the National Science Foundation (NSF) in the President’s Fiscal Year 2014 budget.
According to the announcement on The White House Blog, “The Initiative promises to accelerate the invention of new technologies that will help researchers produce real-time pictures of complex neural circuits and visualize the rapid-fire interactions of cells that occur at the speed of thought. Such cutting-edge capabilities, applied to both simple and complex systems, will open new doors to understanding how brain function is linked to human behavior and learning, and the mechanisms of brain disease.”
In January the Human Brain Project (HBP) received $1.3 billion in funding from the European Commission. The project is the brainchild of Swiss neuroscientist Henry Markram, who plans to create a precise simulation of a human brain using a supercomputer. The project hopes to “develop six ICT platforms, dedicated respectively to Neuroinformatics, Brain Simulation, High Performance Computing, Medical Informatics, Neuromorphic Computing and Neurorobotics. In all cases, the platforms will build on existing capabilities, some but not all developed by the HBP partners.”
Along with these advancements, startups have also begun to enter this new computing frontier. Last week, Toronto-based InteraXon, creator of Muse: the brain sensing headband, announced it has raised $6 million in Series A financing from a number of prominent investors.
The company claims that its technology can monitor the neurons in the brain: “as they fire, they generate magnetic fields that can be easily read from the head using an Electroencephalograph, or EEG. The InteraXon system analyses these readings and separates the waves by frequency into alpha, beta, gamma, and theta waves, each of which is associated with a particular conscious state. After analyzing and sorting the waves by type, our software compares the amount of energy in each band and generates a control signal that correlates to the strength of a particular brain state.”
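The band-sorting step InteraXon describes is standard spectral analysis, and can be sketched with a Fourier transform. The sampling rate and the synthetic signal below are assumptions for illustration, not details of the Muse hardware:

```python
import numpy as np

fs = 256  # sampling rate in Hz (typical for consumer EEG; assumed here)
t = np.arange(0, 2, 1 / fs)

# Synthetic "EEG": a strong 10 Hz rhythm buried in noise.
rng = np.random.default_rng(1)
signal = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * rng.normal(size=t.size)

# Power spectrum of the recording.
spectrum = np.abs(np.fft.rfft(signal)) ** 2
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# Sum the power falling inside each conventional frequency band.
bands = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}
power = {name: spectrum[(freqs >= lo) & (freqs < hi)].sum()
         for name, (lo, hi) in bands.items()}

dominant = max(power, key=power.get)
print(dominant)  # the 10 Hz rhythm lands in the alpha band
```

The relative power across bands is exactly the kind of per-band "energy" comparison the company describes turning into a control signal.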
“With InteraXon’s Muse, you can merely use your brain to make things happen. By turning what seems like science fiction into reality, they’re defining an entirely new category of wearables that are thought controlled,” says Sundeep Peechu, partner at Felicis Ventures.
Combining the growing interest in wearable technology with the nearly limitless potential of the human brain is an exciting frontier.
Who is behind BRAIN?
Dr. Francis Collins, Director of the National Institutes of Health, highlights the BRAIN Initiative. BRAIN (Brain Research through Advancing Innovative Neurotechnologies) is a bold new research effort, proposed by President Obama, to revolutionize our understanding of the human mind and to advance the President’s vision of creating jobs and building a thriving middle class by investing in research and development.
Francis Sellers Collins (born April 14, 1950) is an American physician-geneticist noted for his discoveries of disease genes and his leadership of the Human Genome Project (HGP). He currently serves as Director of the National Institutes of Health in Bethesda, Maryland. Prior to being appointed Director, he was the founder and president of the BioLogos Foundation, an organization which promotes discourse on the relationship between science and religion and advocates the perspective that belief in Christianity can be reconciled with acceptance of evolution and science. Collins also wrote the New York Times bestseller, The Language of God: A Scientist Presents Evidence for Belief, which discusses Collins’ conversion from atheism to Christianity, evaluates the evidence for Christianity, and argues for theistic evolution. In 2009, Pope Benedict XVI appointed Collins to the Pontifical Academy of Sciences.
Collins accepted an invitation in 1993 to succeed James D. Watson as Director of the National Center for Human Genome Research, which became the National Human Genome Research Institute (NHGRI) in 1997. As Director, he oversaw the International Human Genome Sequencing Consortium. A working draft of the human genome was announced in June 2000, and Collins was joined by US President Bill Clinton and biologist Craig Venter in making the announcement.
How to make a digital human brain
Futurists warn of a technological singularity on the not-too-distant horizon when artificial intelligence will equal and eventually surpass human intelligence. But before engineers can make a machine that truly mimics a human mind, scientists still have a long way to go in modeling the brain’s 100 billion neurons and their 100 trillion connections.
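The scale of the modeling problem can be made concrete with a quick back-of-envelope calculation. The 4-bytes-per-connection figure below is an assumption chosen only for illustration, not a claim about any particular simulation:

```python
# Back-of-envelope: how much storage would a bare map of the brain's
# wiring need? Figures are assumptions for illustration only.
synapses = 100e12        # ~100 trillion connections, per the estimate above
bytes_per_synapse = 4    # assume 4 bytes to record one connection

total_bytes = synapses * bytes_per_synapse
petabytes = total_bytes / 1e15
print(f"{petabytes:.1f} PB")  # roughly 0.4 PB just for the wiring diagram
```

Even this crude estimate, which ignores weights, dynamics, and the glial cells discussed below, lands in the hundreds of terabytes.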
Already in Europe, neuroscientist Henry Markram and his team established the controversial but ambitious Human Brain Project that’s seeking to build a virtual brain from scratch. Earlier this year, U.S. President Barack Obama announced that millions of federal dollars will be put toward efforts to map the brain’s activity through the Brain Research through Advancing Innovative Neurotechnologies, or BRAIN, Initiative.
Friday night, a panel of experts at the World Science Festival in New York parsed the challenges such undertakings pose for science and technology. The following are four of the hurdles to making a digital brain discussed during the session "Architects of the Mind: A Blueprint for the Human Brain."
1. The brain isn’t a computer
Perhaps scientists could build computers that are like brains, but brains don’t run like computers. Humans have a tendency to compare the brain to the most advanced machinery of the day, said developmental neurobiologist Douglas Fields, of the National Institute of Child Health and Human Development. Though our best analogy is a computer right now, "it’s humbling to realize the brain may not work like that at all," Fields added.
The brain, in part, communicates through electrical impulses, but it’s a biological organ made of billions of cells, and cells are essentially just "bags of seawater," Fields said. The brain has no wires, no digital code and no programs. Even if scientists could aptly use the analogy of computer code, they wouldn’t know what language the brain was written in.
2. Scientists need better technology
Kristen Harris, a neuroscientist at the University of Texas at Austin, slipped into a computer analogy herself, saying that researchers tend to think a single brain cell has the equivalent power of a laptop. That’s just one way of illustrating the daunting complexity of the processes at work in each individual cell.
Scientists have been able to look at the connections between individual neurons in amazing detail, but only by way of a painstaking process. They finely slice neural tissue, scan hundreds of those slices under an electron microscope, and then put those slices back together again in a computer reconstruction, explained Murray Shanahan, a professor of cognitive robotics at Imperial College London.
To repeat that process for an entire brain would take lifetimes using current technology. And to get an idea of the average brain, scientists would have to compare these trillions of connections across many different brains.
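A rough sense of why this would "take lifetimes" comes from simple arithmetic. Both figures below are assumptions chosen for illustration, not measured throughput numbers:

```python
# Illustrative arithmetic with assumed, not sourced, figures:
# suppose a lab could fully slice, image, and reconstruct about one
# cubic millimetre of neural tissue per year with current methods.
brain_volume_mm3 = 1.2e6   # a human brain is roughly 1,200 cm^3
mm3_per_year = 1.0         # assumed throughput per lab

years = brain_volume_mm3 / mm3_per_year
print(f"{years:,.0f} lab-years")
```

Under those assumptions the answer is on the order of a million lab-years for a single brain, which is why Harris calls for tools that work "at a faster level."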
"The big challenge is giving me, the scientist, the tools to do that analysis at a faster level," Harris said. She added that physicists and engineers might be able to help scientists scale up, and she is hopeful the BRAIN Initiative will spur such collaboration.
3. It’s not all about neurons
Even if newer machines could efficiently map all of the trillions of neuron connections in the brain, scientists would still have to decipher what all of those links mean for human consciousness and behavior.
What’s more, neurons make up only 15 percent of the cells in the brain, Fields said. The other cells are called glia, which is the Greek word for "glue." It was long thought that these cells provided structural and nutritional support for the neurons, but Fields said glia might be involved in vital background communication in the brain that’s neither electric nor synaptic.
Scientists have detected changes in glial cells in patients with amyotrophic lateral sclerosis (ALS), epilepsy and Parkinson’s disease, Fields said. A 2011 study found abnormalities in glial cells known as astrocytes in the brains of depressed people who had committed suicide. Fields also pointed out the neurons in Einstein’s brain were not remarkable, but his glial cells were bigger and more complicated than those found in an average brain.
4. The brain is part of a bigger body
The brain is constantly responding to input from the rest of the body. Studying the brain in an isolated way inherently ignores the signals coming in through those pathways, warned Gregory Wheeler, a logician, philosopher and computer scientist at Carnegie Mellon University.
"Brains evolved in order to make the body move around in the world," Wheeler said. Instead of modeling the brain in a disembodied way, scientists should put it in a body: a robot body, that is.
There are already some examples of the kind of machine Wheeler has in mind. He showed the audience a video of Shrewbot, a robot modeled after the Etruscan pygmy shrew created by researchers at the Bristol Robotics Lab in the United Kingdom. The signals coming in from the robot’s sensitive "whiskers" influence its next moves.