As impressive as recent advancements in areas of artificial intelligence such as facial recognition and autonomous devices have been, much work remains to be done in the realm of "emotional intelligence," a panel of leading AI researchers conveyed this week at the World Economic Forum's annual meeting. Here are some of the key insights from the discussion.
Context Is Everything
Facial recognition algorithms track certain characteristic points on a person's face, such as the corners of the eyes or mouth, said Maja Pantic, professor of affective and behavioral computing at Imperial College London. Based on the movements of those points, one can reason about which emotions are being expressed. But higher levels of interpretation take into account what the particular emotion means: whether it's positive or negative, and what mental state it signifies, such as attentiveness or depression.
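The approach Pantic describes can be illustrated with a toy sketch: track where landmark points sit relative to a neutral face, and map their displacement to a coarse expression label. This is a simplified, hand-thresholded illustration only; real systems of the kind discussed here use learned models, and the landmark names, coordinates, and thresholds below are assumptions.

```python
def classify_expression(neutral, current, threshold=0.02):
    """Compare tracked landmark positions against a neutral face.

    `neutral` and `current` map landmark names to (x, y) coordinates
    normalized to face size (y grows downward). Returns a coarse label.
    """
    # Average vertical displacement of the two mouth corners.
    dy = sum(
        current[p][1] - neutral[p][1]
        for p in ("mouth_corner_left", "mouth_corner_right")
    ) / 2
    if dy < -threshold:   # corners pulled upward -> smile
        return "smile"
    if dy > threshold:    # corners pulled downward -> frown
        return "frown"
    return "neutral"

neutral = {"mouth_corner_left": (0.35, 0.70), "mouth_corner_right": (0.65, 0.70)}
smiling = {"mouth_corner_left": (0.34, 0.64), "mouth_corner_right": (0.66, 0.64)}
print(classify_expression(neutral, smiling))  # -> smile
```

The higher-level interpretation Pantic mentions would sit on top of labels like these, mapping them to positive or negative affect and to mental states over time.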
What all of that lacks, however, is context. "Who the person is, is the first contextual question," Pantic said. "Where are they? Why are they there? There are currently no machine learning methods that truly take context into account." Currently it's possible for a machine to analyze a second of video and draw conclusions, but not days, she added: "If you talk about moods, then days are needed. How we will do it in machine learning is still not known. That's one of the challenges we have in the field."
Social Savvy
As robots become more prevalent, they will need to do more than recognize speech commands and know how to navigate around obstacles—that is, if they're really going to blend into society.
"When you interact with people you have to take into account the social situations they're in," said panelist Vanessa Evers, professor of human media interaction at the University of Twente. Evers and other researchers worked on a project led by airline KLM that created a robot called "Spencer," which will guide passengers who are unfamiliar with an airport from gate to gate. Spencer was tested late last year at Amsterdam's Schiphol airport.
Robots like Spencer shouldn't treat humans like they're just another object, Evers said. For example, it would be ideal if Spencer could recognize when a person ahead is taking a photo, and go around them to avoid spoiling the shot, she said. Giving robots such social skills is "essential when you want them to integrate with our environments. It's one thing when they're exploring Mars or when they're in the factory. But when they come into our homes it's important they behave like something that seems intuitive to us." Algorithms like the ones Pantic develops are key to realizing this goal, Evers said.
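The photo example can be sketched as one social rule layered on top of ordinary navigation: before moving, check whether the robot's straight-line path would cross the sight line between a photographer and their subject, and detour if so. This is a minimal geometric illustration, not how Spencer is actually implemented; the helper functions and positions are assumptions.

```python
def _ccw(a, b, c):
    # Signed area test: >0 if a, b, c turn counterclockwise.
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_cross(p1, p2, q1, q2):
    """True if segment p1-p2 strictly crosses segment q1-q2."""
    return (_ccw(p1, p2, q1) * _ccw(p1, p2, q2) < 0 and
            _ccw(q1, q2, p1) * _ccw(q1, q2, p2) < 0)

def plan_step(robot, goal, photographer, subject):
    """Return "direct" if the path is socially safe, else "detour"."""
    if segments_cross(robot, goal, photographer, subject):
        return "detour"
    return "direct"

# A photo is being taken across the robot's intended path.
print(plan_step((0, 0), (10, 0), (5, -3), (5, 3)))  # -> detour
```

A purely obstacle-avoiding planner would take the direct path here, since neither person physically blocks it; the social rule is what makes the robot yield.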
The panel had much more to say about current and future advancements in emotional intelligence for AI. Check out the full discussion in the video below.