Using an off-the-shelf neural network, the Cambridge team trained the robot by showing it eight simple salad recipes in action, filmed by themselves. The neural network had already been programmed to identify a range of different objects, including the fruits and vegetables used in the recipes, such as broccoli, carrot, apple and banana. Using computer vision, the robot analysed the objects in each frame of video, converting the recipes and the chef's actions into vectors that could then be compared mathematically.
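As a rough illustration of the idea, and not the team's actual pipeline, a demonstration video might be reduced to a simple ingredient-count vector built from the objects a detector reports. The ingredient list, the event format and the counting rule below are all assumptions for the sake of the sketch.

```python
# Minimal sketch, assuming each detected ingredient instance arrives as one
# event (deduplication across frames is taken as already done elsewhere).
from collections import Counter

# Assumed ingredient vocabulary; order fixes the vector layout.
INGREDIENTS = ["broccoli", "carrot", "apple", "banana", "orange"]

def recipe_vector(detected_events):
    """Collapse a video's detected ingredient instances into a count vector.

    detected_events: e.g. ["apple", "apple", "carrot", "carrot"]
    Returns counts aligned with INGREDIENTS, ignoring unknown labels.
    """
    counts = Counter(label for label in detected_events if label in INGREDIENTS)
    return [counts[name] for name in INGREDIENTS]

# Two apples and two carrots -> [0, 2, 2, 0, 0]
print(recipe_vector(["apple", "apple", "carrot", "carrot"]))
```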
Of the 16 videos it was shown, the robot recognised the correct recipe 93 per cent of the time, even though it only detected 83 per cent of the human chef's actions. It was also able to recognise that slight variations in a recipe, such as making a double portion, did not constitute a new recipe. The robot also correctly recognised the demonstration of a new, ninth salad, added it to its 'cookbook' and made it. The work is published in IEEE Access.
“We wanted to see whether we could train a robot chef to learn in the same incremental way that humans can – by identifying the ingredients and how they go together in the dish,” said first author Grzegorz Sochacki from Cambridge’s Department of Engineering, a PhD candidate in Professor Fumiya Iida’s Bio-Inspired Robotics Laboratory.
“It’s amazing how much nuance the robot was able to detect. These recipes aren’t complex – they’re essentially chopped fruits and vegetables, but it was really effective at recognising, for example, that two chopped apples and two chopped carrots is the same recipe as three chopped apples and three chopped carrots.”
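One plausible way to picture this kind of scale-invariant matching, purely as an illustration rather than the published method, is to compare recipe vectors with cosine similarity: doubling or tripling every ingredient count leaves the angle between vectors unchanged, so a larger portion still matches the same cookbook entry, while a genuinely different ingredient mix falls below a threshold and is added as a new recipe. The threshold, cookbook structure and recipe names below are invented for the sketch.

```python
# Minimal sketch: cosine similarity ignores overall scale, so a double or
# triple portion of the same ingredients maps to the same recipe.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# Vector layout (assumed): [broccoli, carrot, apple, banana, orange]
cookbook = {
    "apple and carrot salad": [0, 2, 2, 0, 0],  # two apples, two carrots
}

observed = [0, 3, 3, 0, 0]  # three apples, three carrots (a larger portion)
best = max(cookbook, key=lambda name: cosine(cookbook[name], observed))

if cosine(cookbook[best], observed) > 0.95:  # assumed matching threshold
    print("matched existing recipe:", best)
else:
    cookbook["new salad"] = observed  # treat it as a new recipe and remember it
```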
While the results are promising, the videos that the robot was trained on are unlike the type of food videos that pervade social media platforms like YouTube and TikTok, where edits and visual effects are often used. The videos created by the Cambridge team were linear, with clear visual cues for each object and action, making it easier for the robot to learn. As the technology progresses, however, robots may be able to learn from the proliferation of food videos already available in online spaces.
“Our robot isn’t interested in the sorts of food videos that go viral on social media – they’re simply too hard to follow,” said Sochacki. “But as these robot chefs get better and faster at identifying ingredients in food videos, they might be able to use sites like YouTube to learn a whole range of recipes.”
The research was supported in part by Beko plc and the Engineering and Physical Sciences Research Council (EPSRC).