Robot learns to recognise itself in the mirror


For more technology stories, visit newscientist.com/technology

One Per Cent

NICO spends a lot of time looking in the mirror. But it's not mere vanity – Nico is a humanoid robot that can recognise its reflection – a step on the path towards true self-awareness. In fact, Nico can identify almost exactly where its arm is in space based on the mirror image. Justin Hart and Brian Scassellati at Yale University have taught Nico to recognise the arm's location and orientation to an accuracy of 2 centimetres in any dimension. It is a feat of spatial reasoning that no robot has ever accomplished before.

Nico is the centrepiece of a unique experiment to see whether a robot can tackle a classic test of self-awareness called the mirror test. What does it take to pass the test? An animal (usually) has to recognise that a mark on the body it sees in the mirror is in fact on its own body. Only dolphins, orcas, elephants, magpies, humans and a few other apes have passed the test so far.

Precise recognition of where its body is in space will be key if Nico is to get to grips with the mirror test, which by its nature is performed in 3D. Before it does, though, the robot will need to learn more about itself. The team plan to teach Nico how to recognise where its torso and head are, what shape they are, and their colour and texture, so it can see and react to the mark on its body.

Nico already understands how to connect movement of its limb to motion in its reflection, another important skill it achieved in an experiment in 2007. "What excites me is that the robot has learned a model of itself, and is using it to interpret information from the mirror," says Hart. He and Scassellati presented the work last month at the Conference on Artificial Intelligence in Toronto, Canada.

"Robotic self-awareness is crucial if robots are ever going to work safely alongside humans"

Mary-Anne Williams of the University of Technology Sydney, Australia, points out that robotic self-awareness is crucial if robots are ever going to work safely alongside humans. "Many robots today not only do not recognise themselves in a mirror, but do not recognise their own body parts directly," she says. "For example, a robot may be able to look downwards and see its feet but not recognise them as its own." Self-awareness is a basic social skill and without it robots will struggle to interact with people effectively, Williams adds.

Hal Hodson

Mirror, mirror on the wall, who is the finest robot of them all? (Image: Jim Kendall Photography)

All-seeing airship finally takes flight

The blimps are back. The US army's Long Endurance Multi-Intelligence Vehicle got off the ground for the first time on 7 August, and flew for 90 minutes over the Lakehurst Naval Air and Engineering Station, New Jersey. LEMV is a hybrid vehicle that gets lift not just from its helium tanks, but also from its wings and four diesel engines. The army wants the blimp's cameras to keep an eye on battlefields from a height of 6 kilometres and remain aloft for up to 21 days at a time.

34%

Fraction of total US internet traffic devoted to streaming the first week of the London Olympics

Mars rover tech has an eye for art

An X-ray tool designed for NASA's Curiosity rover has been adapted to help preserve Earth's great works of art. X-Duetto directs a beam of X-rays at objects and reads the radiation scattered back to determine what an object is made from. Giacomo Chiari of the Getty Conservation Institute in Los Angeles plans to use it to analyse artefacts safely and come up with ways to preserve pieces against the ravages of time.


Squeeze a plant to check your email

Bored of clicking a mouse? Try squeezing a leaf instead. At the SIGGRAPH conference in Los Angeles last week, a Disney research team turned plants into controllers for audio and visual effects. The stem and the leaves, for example, have their own specific electrical properties. That makes it possible to trigger effects on screen by watching how the amplitude of specific frequencies drops when different parts of a plant are touched (see video at bit.ly/plantcontrol).
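To make the idea concrete, here is a minimal sketch of how frequency-based touch sensing of this kind could work in principle: probe the plant across a sweep of frequencies, compare each reading with an untouched baseline, and report which frequency band shows the biggest amplitude drop. This is not Disney's code; the frequency range, band labels, thresholds and the simulated read_amplitudes function are all assumptions made for illustration.

```python
# Hypothetical sketch of frequency-sweep touch sensing on a plant.
# The sensor read is simulated; a real system would measure the signal
# returned from an electrode attached to the plant's stem.
import numpy as np

PROBE_FREQS_HZ = np.linspace(1e3, 3.5e6, 50)  # assumed sweep of probe frequencies

def read_amplitudes(freqs, touched_band=None):
    """Simulated read: flat response, with a dip where a touch damps the signal."""
    amps = np.ones_like(freqs)
    if touched_band is not None:
        lo, hi = touched_band
        amps[(freqs >= lo) & (freqs <= hi)] *= 0.6  # touching lowers these amplitudes
    return amps + np.random.normal(0, 0.01, freqs.shape)

def classify_touch(amps, baseline, bands, drop_threshold=0.2):
    """Return the labelled band with the largest mean amplitude drop, if any."""
    drops = baseline - amps
    best_label, best_drop = None, drop_threshold
    for label, (lo, hi) in bands.items():
        mask = (PROBE_FREQS_HZ >= lo) & (PROBE_FREQS_HZ <= hi)
        mean_drop = drops[mask].mean()
        if mean_drop > best_drop:
            best_label, best_drop = label, mean_drop
    return best_label

# Usage: calibrate against the untouched plant, then detect a simulated leaf touch.
bands = {"stem": (1e3, 1e6), "leaf": (1e6, 3.5e6)}  # made-up band labels
baseline = read_amplitudes(PROBE_FREQS_HZ)
reading = read_amplitudes(PROBE_FREQS_HZ, touched_band=bands["leaf"])
print(classify_touch(reading, baseline, bands))  # -> "leaf"
```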

For breaking tech news go to: newscientist.com/onepercent