By Douglas Heaven | NewScientist
Microsoft’s depth-sensing camera will gain unprecedented information about us while we play, allowing games to adjust their difficulty in response
You’re cornered and wounded. Cowering behind a crate, all you can do is hide and wait for the acid-spraying alien to move on. You desperately look for a pattern in its movements, hoping for a chance to sneak past to safety.
So far, so scripted. But the chance still doesn’t come. As you’re stuck in your corner, heart rate rising and a sheen of perspiration forming on your face, a camera by the TV feeds data to the game. The system is constantly judging. How much longer can you take the tension? Is this still fun?
The latest game spawned from the Alien film franchise is being made by Creative Assembly, a game studio in Horsham, UK. It is likely to be one of the first games to explore the potential of Microsoft's next-generation Kinect sensor for the Xbox One games console. Unveiled alongside the Xbox One last week, the new Kinect is a huge improvement on its predecessor (see "New wave"). It will have HD colour and infrared cameras that can see whether your eyes are open or closed, even in the dark. It will be able to detect your pulse from fluctuations in skin tone and, by measuring how light reflects off your face, it will know when you start to sweat.
This will allow the new Kinect to bring emotional gaming to your living room. Games can use the biological data to orchestrate your experience by adjusting the difficulty or intensity in real time, depending on how excited the system thinks you currently are.
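In pseudocode terms, this kind of emotional gaming amounts to a feedback loop: read rough arousal signals from the camera, then nudge the game's intensity toward a comfortable band. The sketch below is purely illustrative; the sensor readings, thresholds, and function names are all assumptions, not part of any real Kinect API.

```python
# Hypothetical sketch of biometric difficulty adjustment.
# None of these values come from the real Kinect SDK.

def estimate_arousal(resting_pulse, current_pulse, sweating):
    """Return a 0-1 arousal score from pulse elevation and sweat."""
    elevation = max(0.0, (current_pulse - resting_pulse) / resting_pulse)
    score = min(1.0, elevation * 2.0)       # pulse 50% above resting -> 1.0
    if sweating:
        score = min(1.0, score + 0.25)      # visible sweat nudges the score up
    return score

def adjust_intensity(intensity, arousal, target=0.6, step=0.05):
    """Nudge game intensity (0-1) to keep arousal near a target band."""
    if arousal > target + 0.1:
        return max(0.0, intensity - step)   # player overwhelmed: ease off
    if arousal < target - 0.1:
        return min(1.0, intensity + step)   # player bored: turn it up
    return intensity                        # in the band: leave it alone
```

Run once per sensor update, the loop would gradually settle the game at whatever intensity keeps a given player in the comfortable middle band.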
"The key is understanding what makes games fundamentally satisfying," says Scott Rigby, co-founder of Immersyve, a gaming consultancy in Celebration, Florida, that advises on ways to engage players by gathering this biometric data. "I love the promise of it."
But Rigby warns that detecting signs of high emotion in a player does not automatically mean they are having a good time. "If I poke you with a stick, there is a spike in arousal," he says. "But that doesn’t mean you like it and want me to do it again."
Biometric data from Kinect will still need to be combined with assumptions about what kind of emotional response a section of game is aiming for, says Rigby. For example, in a battle against a big boss, players will typically tolerate dying about four times before getting frustrated, he says. After that, a game might be programmed to lower the level of difficulty. Feedback could be used to tailor this to an individual’s preference.
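Rigby's point can be sketched as a simple rule that combines both signals: the design-time assumption (players tolerate roughly four deaths at a boss) and the live biometric reading. Everything here, from the class name to the numbers, is a hypothetical illustration rather than any shipping game's logic.

```python
# Hypothetical sketch: ease difficulty only when the death count AND
# the biometrics both point to frustration, not just excitement.

class BossEncounter:
    DEATH_TOLERANCE = 4          # typical deaths tolerated before frustration

    def __init__(self, difficulty=1.0):
        self.difficulty = difficulty
        self.deaths = 0

    def on_player_death(self, arousal_high):
        self.deaths += 1
        # A spike in arousal alone could just be enjoyable tension;
        # repeated deaths alone could be a player happily experimenting.
        if self.deaths >= self.DEATH_TOLERANCE and arousal_high:
            self.difficulty = max(0.5, self.difficulty - 0.1)
        return self.difficulty
```

The first few deaths leave the difficulty untouched; only when the tolerance is exceeded while the player also shows signs of stress does the game quietly back off.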
Our bodies give away other clues too. "Kinect could measure how much mental effort you’re putting into a game or a specific task within a game," says games psychologist and writer Jamie Madigan. "And it can tell when you’ve given up."
For example, your pupils dilate when you are engaged in a cognitive challenge, and return to normal when you have given up because something is too hard. "If the Kinect could reliably detect pupil sizes, it would open up a whole new level of scaling game difficulty," says Madigan. For example, a puzzle game could get harder until the player enters the "zone" of peak performance – when gaming is at its most satisfying. It could also offer a hint when it detects you have given up.
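Madigan's pupil-dilation idea could be sketched as a three-way decision, assuming a (made-up) pupil reading normalised against the player's relaxed baseline; the thresholds below are invented for illustration, not measured values.

```python
# Hypothetical sketch of pupil-driven puzzle scaling.
# pupil_ratio = current pupil size / relaxed baseline (assumed measurable).

def next_puzzle_action(pupil_ratio, difficulty):
    """Decide what the puzzle game should do next."""
    if pupil_ratio > 1.15:            # strongly dilated: working hard,
        return ("hold", difficulty)   # in the zone, so keep them there
    if pupil_ratio > 1.05:            # mildly dilated: engaged but coasting
        return ("harden", difficulty + 1)
    return ("hint", difficulty)       # back to baseline: likely given up
```

The middle branch ratchets difficulty up until the player hits the zone of peak performance; the last branch is the "offer a hint" case Madigan describes.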