The gap between artificial and human intelligence is getting narrower, thanks to a technological breakthrough by British boffins.
The first “humanoid” robot that can mimic a real person’s expressions merely by watching their face has been developed, bringing us closer to an automaton-filled world previously imagined only in Hollywood films.
New videos have been released showing “Jules” – a disembodied androgynous robotic head – making a convincing attempt at appearing human, controlled automatically by his own software.
Engineers at the Bristol Robotics Laboratory (BRL) have created software that allows Jules to mimic the facial expressions of human beings as he, or she, observes them through a video camera.
The face grins and grimaces, furrows its brow, and “speaks” as the software translates real expressions observed through a video camera into commands that make the robot’s motors produce mirrored facial movements.
And it all happens in real time – Jules can process the video and update his face at 25 frames per second.
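The pipeline the article describes – a video frame goes in, an expression is extracted, and motor commands come out, 25 times a second – can be illustrated with a minimal sketch. All names and the placeholder transform here are hypothetical, standing in for the BRL team’s actual software:

```python
import time

FPS = 25
FRAME_BUDGET = 1.0 / FPS  # 40 ms to analyse each frame and move the motors


def process_frame(frame):
    # Stand-in for the real pipeline: video frame -> observed expression
    # -> motor commands. Here we just apply a placeholder transform.
    return [v * 0.5 for v in frame]


def run(frames):
    commands = []
    for frame in frames:
        start = time.monotonic()
        commands.append(process_frame(frame))
        # A live system would spend the remainder of the 40 ms budget
        # waiting for the next frame; here we only check the work fit.
        elapsed = time.monotonic() - start
        assert elapsed < FRAME_BUDGET
    return commands


out = run([[1.0, 2.0], [3.0, 4.0]])
```

The 40 ms frame budget is the hard constraint: whatever analysis the software does must finish inside it for the mimicry to look live rather than delayed.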
The project, called “Human-Robot Interaction”, is led by Chris Melhuish, Neill Campbell and Peter Jaeckel at the Bristol Robotics Laboratory, which is operated jointly by the University of the West of England and the University of Bristol.
The Bristol team have been developing the breakthrough software over the past three and a half years, along with other projects focused on bettering the interaction between humans and artificial intelligence.
The automaton, a disembodied head with 34 internal motors covered with flexible “Frubber” skin, was commissioned from roboticist David Hanson in the US for the BRL.
Jules could originally be programmed to act out a series of movements – as can be seen in the video where he speaks about “destroying Wales.”
The achievement of the project, though, has been the creation of software that can translate what it “sees” on video into equivalent movements on Jules’s face.
The cutting-edge software is based on ten stock human emotions – such as happiness – that the team at BRL “taught” Jules via programming.
The software then maps what it sees onto Jules’s face, instantly combining the stock expressions to mimic those shown by a human subject.
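The blending step described above can be sketched in a few lines. The sketch below assumes each stock emotion is stored as a vector of target positions for the head’s 34 motors, and that the vision software reports a weight per detected emotion – the names, pose values and weighting scheme are all hypothetical illustrations, not the BRL code:

```python
# Hypothetical sketch: blend stock-emotion motor poses by detected weights.
NUM_MOTORS = 34

# Each stock emotion is a vector of target positions for the 34 motors.
# The uniform values here are placeholders, not real calibration data.
STOCK_EMOTIONS = {
    "neutral":   [0.0] * NUM_MOTORS,
    "happiness": [0.2] * NUM_MOTORS,
    "surprise":  [0.5] * NUM_MOTORS,
}


def blend_pose(weights):
    """Weighted average of stock poses -> one motor command vector.

    `weights` maps emotion name -> detected strength for the current frame.
    """
    pose = [0.0] * NUM_MOTORS
    total = sum(weights.values()) or 1.0
    for emotion, w in weights.items():
        stock = STOCK_EMOTIONS[emotion]
        for i in range(NUM_MOTORS):
            pose[i] += (w / total) * stock[i]
    return pose


# e.g. a face read as mostly happy with a hint of surprise:
command = blend_pose({"happiness": 0.8, "surprise": 0.2})
```

Blending continuous weights, rather than snapping to the single strongest emotion, is what lets the face shift smoothly between expressions frame by frame.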
Chris Melhuish, who runs the BRL, said: “We have a repertoire of behaviours that somehow is dynamic.
“If you want people to be able to interact with machines, then you’ve got to be able to do it naturally.
“When it moves it has to look natural in the same way that human expressions are, to make interaction useful.”
Peter Jaeckel, who works on artificial emotion, artificial empathy and humanoids at the BRL, said: “Realistic, life-like robot appearance is crucial for sophisticated face-to-face robot–human interaction.
“Researchers predict that one day robotic companions will work with, or assist, humans in space, care and education.
“Robot appearance and behaviour need to be well matched to meet expectations formed by our social experience.
“Violation of these expectations due to subtle imperfections or imbalance between appearance and behaviour results in discomfort in humans that perceive or observe the robot.
“If people were put off it would counteract all efforts to achieve trustworthiness, reliability and emotional intelligence.
“All these are requirements for robotic companions, assisting astronauts in space or care robots employed as social companions for the elderly.
“Our work contributes to the modelling and generation of realistic, dynamic facial behaviour in humanoid robotic faces.
“Unlike most research projects the focus lies on dynamic, subtle, facial expressions, rather than static exaggerated facial displays.
“We have investigated ways of capturing human facial motion from video to make it usable for the animation of robotic faces.”
Copycat robot heads have been created before, but never with realistic human-looking faces.
But not everyone is impressed by Jules’s mastery of mimicry – Kerstin Dautenhahn, a robotics researcher at the University of Hertfordshire, believes that people may be disconcerted by humanoid automatons.
She said: “Research has shown that if you have a robot that has many human-like features, then people might actually react negatively towards it.
“If you expose vulnerable people, like children or elderly people, to something that they might mistake for human, then you would automatically encourage a social relationship.
“They might easily be fooled to think that this robot not only looks like a human and behaves like a human, but that it can also feel like a human. And that’s not true.”
It is hoped that the technology developed in Jules will help create robots for use in space, to accompany astronauts on solo missions, and in healthcare settings and nursing homes.