AI language models work well at specific tasks, such as predicting the next word as you type a text. The new generation of artificial intelligence, however, is learning the deeper meaning of language, which in practice enables question answering, document summarization, and story completion. The next step is learning social interaction skills, which a research team at MIT is now developing.
Integrating robots into human society
Boris Katz, principal research scientist and head of the InfoLab Group at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), told MIT News:
“Robots will live in our world soon enough, and they really need to learn how to communicate with us on human terms. They need to understand when it is time for them to help and when it is time for them to see what they can do to prevent something from happening. This is very early work and we are barely scratching the surface, but I feel like this is the first very serious attempt for understanding what it means for humans and machines to interact socially.”
Learning social interaction skills
The research team has built a robotics framework that incorporates specific social interactions, in which an agent learns social behaviours centred on what it means to help and hinder one another. In a simulated environment, a robot watches the behaviour of another robot and decides whether to help or hinder its action, according to the physical and social goals set for it. In a next step, this will be connected to the human world.
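As a rough illustration only, and not the MIT team's actual code, the help-or-hinder decision could be sketched as follows: an observing agent infers another agent's goal from its movement, then chooses an action that either advances that goal (help) or opposes it (hinder). The 1-D world, the function names, and the social coefficient are all illustrative assumptions.

```python
# Toy sketch of a "help or hinder" social agent in a 1-D world.
# An actor walks toward its goal; an observer infers the goal direction
# from the actor's observed positions, then pushes the actor toward the
# goal (help) or away from it (hinder). Everything here is an
# illustrative assumption, not the researchers' implementation.

def infer_goal(positions):
    """Guess the actor's goal direction (+1 or -1) from its movement."""
    return 1 if positions[-1] > positions[0] else -1

def social_action(observed_positions, social_coefficient):
    """Return the observer's push: same sign as the goal helps, opposite hinders.

    social_coefficient > 0 -> helping agent
    social_coefficient < 0 -> hindering agent
    """
    goal_direction = infer_goal(observed_positions)
    return goal_direction if social_coefficient > 0 else -goal_direction

# The actor is observed moving right for three steps.
trajectory = [0, 1, 2]
print(social_action(trajectory, social_coefficient=1.0))   # helper pushes right: 1
print(social_action(trajectory, social_coefficient=-1.0))  # hinderer pushes left: -1
```

The sign of the social coefficient is what separates a helper from a hinderer; in the simulated setting the article describes, such a parameter would be part of the social goal assigned to each robot.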
The MIT researchers are now working on an environment with 3D agents that allows many more forms of interaction, including actions that fail. Research on AI that learns social interaction skills offers an intriguing glimpse of what future AI/robot–human interactions could look like.