Last week we made a significant breakthrough in the Artificial Culture project. My student Mehmet Erbas demonstrated robot-robot imitation for the first time. To be more precise: one e-puck robot first watches another e-puck perform a sequence of movements, then attempts to imitate that same sequence. This sounds much easier than it is. It's difficult for two reasons. Firstly, the e-puck can't see very well. It has only one eye - so no stereo vision and no depth information. We therefore make it easier for the robots to see each other by fitting them with coloured skirts in primary colours. Secondly, the robot has to translate what it has seen (which amounts to a coloured blob moving left or right and/or getting larger or smaller within its field of vision) into a set of motor commands so it can copy those movements. This transformation is what researchers in imitation in humans and animals call the correspondence problem.
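As a rough illustration of the correspondence problem, here is a minimal sketch in Python. It assumes the observing robot logs the watched robot as a coloured blob with a horizontal image position and an apparent size, and maps changes in those two quantities onto wheel speeds for a differential-drive robot like the e-puck. Everything here - the names, the gains, the blob representation - is hypothetical, not Mehmet's actual code.

```python
# A minimal sketch of the correspondence problem, NOT the project's actual
# code: the names, thresholds, and blob representation are all hypothetical.
from dataclasses import dataclass


@dataclass
class BlobObservation:
    x: float      # horizontal blob centre in the image, 0.0 (left) to 1.0 (right)
    size: float   # apparent blob area; grows as the demonstrator approaches


def observations_to_motor_commands(obs, x_gain=2.0, size_gain=5.0):
    """Translate a sequence of blob observations into (left, right) wheel
    speeds for a differential-drive robot.

    Horizontal blob motion is read as the demonstrator moving sideways;
    size change is read as motion towards or away from the observer.
    """
    commands = []
    for prev, curr in zip(obs, obs[1:]):
        dx = curr.x - prev.x            # positive: blob moved right
        dsize = curr.size - prev.size   # positive: blob grew (got closer)
        forward = size_gain * dsize     # copy the approach/retreat component
        turn = x_gain * dx              # copy the left/right component
        commands.append((forward + turn, forward - turn))
    return commands


if __name__ == "__main__":
    watched = [BlobObservation(0.5, 0.10),
               BlobObservation(0.6, 0.10),   # blob drifts right
               BlobObservation(0.6, 0.15)]   # blob grows: demonstrator approaches
    for cmd in observations_to_motor_commands(watched):
        print(cmd)
```

In a real implementation the mapping would also have to handle the blob leaving the field of view and the ambiguity between the demonstrator turning and translating, which is part of what makes the problem hard.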
Mehmet has solved these problems with some very neat coding, and the demonstration shows that the imitated dance is - on most runs - a remarkably good copy of the original. We're now working out how to measure the quality of imitation, Qi, so that we can gather results and characterise the average Qi and its variance.
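Since the definition of Qi is still an open question, the sketch below should be read as one plausible candidate rather than the project's metric. Purely for illustration, it scores a run by the mean point-wise distance between the demonstrated and imitated trajectories, mapped into (0, 1], and then summarises the mean and variance of that score over several runs. The trajectories, the Qi definition, and the sample data are all assumptions.

```python
# Purely illustrative: the project has not yet fixed a definition of Qi,
# so this sketch invents one (trajectory similarity) just to show how the
# average Qi and its variance across runs would then be summarised.
import math
from statistics import mean, variance


def quality_of_imitation(original, copy):
    """Hypothetical Qi in (0, 1]: 1.0 means a perfect copy.

    Both arguments are equal-length lists of (x, y) robot positions
    sampled at the same rate; Qi decays with mean point-wise distance.
    """
    dists = [math.dist(p, q) for p, q in zip(original, copy)]
    return 1.0 / (1.0 + mean(dists))


# Example: Qi over several imitation runs of the same demonstrated dance.
demo = [(0.0, 0.0), (0.1, 0.0), (0.2, 0.1)]
runs = [
    [(0.0, 0.0), (0.1, 0.01), (0.2, 0.12)],   # close copy
    [(0.0, 0.0), (0.15, 0.05), (0.3, 0.2)],   # looser copy
]
scores = [quality_of_imitation(demo, run) for run in runs]
print(f"mean Qi = {mean(scores):.3f}, variance = {variance(scores):.4f}")
```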