SciFest Profile: Jona - your future best friend?
Participating in research experiments is the best part of the job. Coffee with colleagues is another highlight of the day. And the number one leisure activity is probably watching soccer. Jona could be one of those young doctoral students one encounters in the IT labs at Polacksbacken. With one exception: Jona is a social robot.
Sitting down opposite the robot, or rather the robotic head, makes me a bit nervous. The blonde, fluffy hair, bright green eyes and slightly rosy cheeks are remarkably human-like, an impression reinforced by constant facial movements such as winks and smiles.
‘Jona really looks like a human, which means that we have a unique opportunity to study the way people interact with the robot,’ says Maike Paetzel, a doctoral student at the Social Robotics Lab at the Department of Information Technology.
‘Different parts of the face such as the eyes and the mouth are projected from within the robot’s head. Using a variety of sensors and cameras, we can do various animations on this virtual face and program the robot’s eyes to follow people and have the robot read people’s emotional and behavioral cues.’
Initially, Jona, or Furhat as the robot type is called, was developed by three researchers at KTH. The social robot trained in human interaction caught the attention of the Division of Visual Information and Interaction at Uppsala University, home of the new Social Robotics Lab led by Associate Senior Lecturer Ginevra Castellano. KTH and Gothenburg University are among her research partners as she builds up her own research group.
‘The main idea is to build computational models and develop robots that behave in ways that are socially acceptable for individuals and society in application areas such as education, service, health-care and assistive technology,’ says Ginevra Castellano.
‘Social robots could serve as assistants to, for example, elderly people who need some form of specialised help and support, and not only with physical chores, but also in the form of social assistance.’
The question, however, is how human complexity can be identified and programmed down to the last detail so that robots can keep up with more advanced interaction. Ginevra Castellano believes that the development of machine learning methods, combined with automatic analysis of human behaviour, will give robots the tools they need to arrive at the right decisions. Her colleague, Professor Ingela Nyström, agrees.
‘The robots must be able to see in order to move, and that kind of vision is attainable through cameras and image analysis. Similarly, the robots must be able to interpret people's movements in order to interact with them, which in turn requires even more advanced image analysis. We still have a long way to go before we can model a robot's automatic response to people's actions, but that is the long-term objective.’
The same applies to speech. So far, Maike Paetzel programs questions in text form in advance before sending them to Jona; the computer then interprets the questions so that Jona can respond by voice. This may prove something of an ordeal when Jona takes part in the science festival SciFest at Fyrishov on March 5.
‘There will certainly be quite a lot of people in the booth so we'll see if Jona manages to sort out the impressions and can focus on one person at a time,’ says Maike Paetzel.
‘The reason we are taking part in SciFest is that we think Jona’s presence will be appreciated, while at the same time the research project gets feedback on how comfortable people are around robots.’
So what does Jona think about taking part in SciFest?
‘I hope I’ll get to meet lots of children and their parents and hear what they think about me. But I will not be the only robot there, so it’ll also be interesting to see what the other robots will do.’
More about the Social Robotics Lab
Anneli Björkman