British scientists want to make emo robots
Sonny from I, Robot just might be getting some company soon if the Feelix Growing project succeeds in its quest to create robots that have the capability to interact with people emotionally. Staffed by 35 roboticists, developmental psychologists and neuroscientists, the project aims to enable robots to “learn from humans and respond in a socially and emotionally appropriate manner.”
Dr. Lola Canamero of the University of Hertfordshire believes that “the human emotional world is very complex but we respond to simple cues, things we don’t notice or we don’t pay attention to, such as how someone moves.” Given this, it may not be so far-fetched for robots to have social and emotional interactions with humans via the feedback they receive from us.
For example, the emotive robots may detect expressions through vision cameras, audio, contact sensors, and sensors that “can work out the distance between the machine and the humans.” To do this, they will be using artificial neural networks to pick out expressions from motion. “Neural networks learn patterns from examples of observation,” explains Dr. Canamero.
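To get a feel for what “learning patterns from examples” means, here is a minimal sketch of the idea behind it: a single artificial neuron (a perceptron) that learns, from a handful of labeled examples, to classify a person’s movement cues. The feature names, the toy data, and the “comfortable vs. uncomfortable” labels are all hypothetical illustrations, not anything from the Feelix Growing project itself.

```python
# Toy sketch: one artificial neuron learning a pattern from examples.
# Features and labels are hypothetical, purely for illustration.

def train_perceptron(examples, epochs=20, lr=0.1):
    """examples: list of (features, label) pairs with label 0 or 1."""
    n = len(examples[0][0])
    w = [0.0] * n  # one weight per input feature
    b = 0.0        # bias term
    for _ in range(epochs):
        for x, y in examples:
            # Fire (output 1) if the weighted sum crosses the threshold.
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred  # -1, 0, or +1
            # Nudge weights toward the correct answer.
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Hypothetical movement cues: (approach_speed, distance).
# Label 1 = person seems comfortable, 0 = uncomfortable.
data = [((0.2, 0.9), 1), ((0.9, 0.1), 0),
        ((0.1, 0.8), 1), ((0.8, 0.2), 0)]
w, b = train_perceptron(data)
print(predict(w, b, (0.15, 0.85)))  # slow approach, keeping distance → 1
```

A real system would use far richer sensor streams and multi-layer networks, but the principle is the same: the machine is never told an explicit rule, it just adjusts its internal weights until its outputs match the examples it was shown.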
The cost of the project is approximately €2.3m (US$3m). I don’t know about you guys, but it seems we’ve got more than enough emotions running in this world already. Maybe it’d be best to just leave those robots alone, eh?
Via BBC