Language-learning techniques used by young children are being applied in a bid to break new ground by developing algorithms that enable robots to learn and understand concepts.
As part of the project by Plymouth University researchers, two robots will be built featuring software that allows them to interact with each other and exchange learned information, much as humans do.
The work at Plymouth is the latest in a number of ambitious initiatives to apply human learning processes to robotic systems. Earlier this year (7 April) The Engineer reported how the European COSPAL project plans to help robots learn to carry out actions. The Plymouth project, meanwhile, concentrates on word meaning.
Tony Belpaeme at Plymouth said: 'Robots still don't know the meaning of things. The only techniques we have at the moment are using mathematical tricks and statistics to produce more or less sensible replies.
'For example, search engines typically try to match search terms to words in web pages, and although this is a surprisingly effective way of finding relevant webpages, they do not have any understanding of what it is you're looking for.
'So, simple queries such as 'does a stone float' cannot be answered by a search engine as it does not know the meaning of words.'
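To illustrate the point, the sketch below shows the kind of keyword matching Belpaeme describes; it is a simplified illustration rather than code from any real search engine. A naive matcher scores pages purely by the words they share with the query, so a page that happens to contain 'stone' and 'float' beats one that actually answers the question.

```python
# Illustrative sketch only: naive keyword matching with no notion of meaning.
def keyword_score(query: str, page_text: str) -> int:
    query_terms = set(query.lower().split())
    page_terms = set(page_text.lower().split())
    return len(query_terms & page_terms)  # shared words, nothing more

pages = [
    "Buy stone garden ornaments that float your design ideas.",       # irrelevant
    "Objects denser than water sink; granite is denser than water.",  # the real answer
]
query = "does a stone float"
for page in pages:
    print(keyword_score(query, page), "-", page)
# The irrelevant page scores 2 (it contains 'stone' and 'float');
# the page that actually answers the question scores 0.
```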
He added: 'The way out is to let computers and robots really experience the meaning of words by going through a process where the meaning of words is gradually learned, much as children learn the meaning of words.'
The first robots will be designed to encourage human interaction, with a long neck, similar to an industrial robot arm, and a face in place of a grip so it can look around or at an object from all sides.
There will be speakers, a microphone and two cameras in the robot's head, which Belpaeme said will be able to pick out humans in a room, make eye contact, track human gaze and interpret pointing gestures and correlate them with what is being pointed at.
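As a rough sketch of how a pointing gesture might be correlated with an object (an illustration under assumed inputs, not the Plymouth team's software), one option is to pick the object whose position lies closest to the ray defined by the hand position and pointing direction reported by the vision system.

```python
# Hedged sketch: choose the object closest to the pointing ray.
# Hand position, pointing direction and object positions are assumed inputs.
import numpy as np

def pointed_object(hand_pos, point_dir, objects):
    direction = np.asarray(point_dir, dtype=float)
    direction /= np.linalg.norm(direction)
    best_name, best_perp = None, float("inf")
    for name, pos in objects.items():
        offset = np.asarray(pos, dtype=float) - np.asarray(hand_pos, dtype=float)
        along = np.dot(offset, direction)                   # projection onto the ray
        perp = np.linalg.norm(offset - along * direction)   # distance off the ray
        if perp < best_perp:
            best_name, best_perp = name, perp
    return best_name

objects = {"ball": (1.0, 0.2, 0.0), "cup": (0.3, 1.0, 0.0)}
print(pointed_object((0.0, 0.0, 0.0), (1.0, 0.1, 0.0), objects))  # -> ball
```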
'We are going to make the robot look cute and we are going to try to trick people into teaching the robot things just as they would a small child,' said Belpaeme.
'It will have the intelligence of a three- to four-year-old in terms of competence of understanding things, but we are aiming at it having the interactions of a two- to three-year-old. If we can replicate that, I would be over the moon.'
Drawing on developmental psychologists' knowledge of child language acquisition, the researchers hope to explore how children learn words in the early years of life and then implement on the robots the 'tricks' that children use.
'For example, one trick is that children always assume that if they hear a new word, it refers to the whole object. Suppose you show them a doll and say 'look, arm'; they will think that 'arm' means the whole doll. So this whole-object constraint helps them to learn certain things pretty fast.
'They also make typical mistakes, like when they first learn about dogs, they over-generalise and think that anything that has four legs is a dog, and then they narrow it down,' said Belpaeme.
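A minimal sketch of those two 'tricks' follows; it is my illustration of the heuristics Belpaeme describes, not the project's learning algorithm, and the word-to-features representation is assumed. A new word is first mapped onto all of an object's observed features, and the hypothesis is then narrowed as further examples arrive.

```python
# Hedged sketch: whole-object assumption plus gradual narrowing.
def learn_word(lexicon, word, object_features):
    feats = set(object_features)
    if word not in lexicon:
        lexicon[word] = feats      # whole-object assumption: take every feature
    else:
        lexicon[word] &= feats     # narrow down: keep only features seen every time
    return lexicon

lexicon = {}
learn_word(lexicon, "dog", {"four_legs", "fur", "barks", "brown"})
learn_word(lexicon, "dog", {"four_legs", "fur", "barks", "black"})
print(lexicon["dog"])  # {'four_legs', 'fur', 'barks'} - colour has been pruned
# Over-generalisation persists: 'four_legs' and 'fur' also fit a cat, so this
# learner would still call a cat a dog until contrasting examples narrow it further.
```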
The Plymouth team plans to speed up the learning process by training more than one robot in different locations, then bringing them together so they can teach each other what they have learned without any human guidance or control.
'For example, you could have a robot with one family that has a cat, and it could learn what a cat is, and you could have a second robot in another family with a dog, so that robot learns what a dog is.
'It is not really easy to copy and paste information from one robot brain to another because if you do that you are going to upset everything that the robot already knows. What we can do is let robots talk to each other over an internet connection, interact and exchange information in the way that children would exchange information in a nursery,' said Belpaeme.
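The exchange Belpaeme describes could look something like the sketch below, which assumes a word-to-features representation and a simple merge rule purely for illustration: rather than overwriting one robot's memory with the other's, each robot adopts the concepts it lacks and merges evidence for those it already has.

```python
# Hedged sketch: two robots share learned concepts without a destructive copy-paste.
def exchange(sender, receiver):
    for word, features in sender.items():
        if word not in receiver:
            receiver[word] = set(features)   # adopt a concept it has never seen
        else:
            receiver[word] |= features       # merge evidence, keep existing knowledge

robot_cat_family = {"cat": {"four_legs", "fur", "miaows"}}
robot_dog_family = {"dog": {"four_legs", "fur", "barks"}}
exchange(robot_cat_family, robot_dog_family)
exchange(robot_dog_family, robot_cat_family)
print(robot_cat_family)  # now knows both 'cat' and 'dog'
print(robot_dog_family)
```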
The aim is for the robots to learn concepts including the meaning of words, names of objects, simple verbs and relations between objects, such as on, in, near or far.
Their ability to understand these will then be tested through simple games.
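One plausible way to ground relation words such as 'on', 'near' and 'far' is to test simple geometric conditions over object positions reported by the cameras; the sketch below uses assumed thresholds and is not the project's representation.

```python
# Hedged sketch: mapping spatial relation words to simple geometric tests.
import math

def spatial_relation(obj, ref, near_threshold=0.5):
    dx, dy, dz = (obj[i] - ref[i] for i in range(3))
    horizontal = math.hypot(dx, dy)
    if horizontal < 0.1 and 0.0 < dz < 0.2:
        return "on"                     # roughly above and touching the reference
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    return "near" if distance < near_threshold else "far"

print(spatial_relation((0.0, 0.0, 0.15), (0.0, 0.0, 0.0)))  # -> on
print(spatial_relation((0.3, 0.2, 0.0), (0.0, 0.0, 0.0)))   # -> near
print(spatial_relation((2.0, 1.0, 0.0), (0.0, 0.0, 0.0)))   # -> far
```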
If the software that the researchers develop works successfully on the initial robots, they plan to build it into the iCub, a robot resembling a five-year-old child, which they built as part of a previous project, iTalk.
Anh Nguyen