Humans+Robots

Forget the class guinea pig. Meet the class robot.

AI-powered social robots can help teach vocabulary, handwriting, and even emotional intelligence.

By Hannah Thomasy

A few years ago, teachers in one Connecticut elementary school were having a problem. They were drawing up an anti-bullying curriculum for third graders, complete with role-playing scenarios, but teachers were reluctant to make students play the victims or the bullies. It’s no fun for a child to get picked on in front of every classmate, even if it’s just pretend.

At first, the teachers tried to play the roles themselves, but that just made the students laugh; they couldn’t imagine adults in those schoolyard scenarios.

Brian Scassellati, a Yale professor of computer science and mechanical engineering, thought this could be a job for robots.

Scassellati runs Yale’s Social Robotics Lab, where researchers build robots that can engage in personalized and adaptive social interactions with humans — and study how those robots can be used to support education and mental health care. His lab had teamed up with the students and teachers as part of a larger partnership with the Yale Center for Emotional Intelligence to develop strategies for nurturing emotional intelligence in children.

For the anti-bullying lesson, Scassellati’s team brought in a pair of six-inch-tall, brightly colored, snowman-shaped robots to do the role-playing. The little robots could talk, move, and be dressed in different outfits. “We told them one of the robots was a bully and one of the robots needed their help,” Scassellati says.

“The kids absolutely believed that one robot was a victim, and they were very motivated to help out,” he says. Children could choose different types of responses to see how different words and actions made the robots feel. For example, when the bully made fun of the victim’s hat, children could try out different scenarios to help the victim robot feel more included — was it better to say bad things about the bully or simply to invite the victim to play?

“The child wanted the robot to succeed.”

Séverin Lemaignan, associate professor in Social Robotics and AI,
Bristol Robotics Laboratory

While social robots — designed for tasks ranging from education to patient care and customer service — are still in relatively early stages of development, they can be powerful tools. Northeastern University computer science professor Timothy Bickmore is working on social robots that can function as public speaking coaches and provide couples counseling. Scientists around the world are examining social robots for treating autism spectrum disorders and depression, providing emotional support for hospitalized children and dementia patients, and assisting with management of conditions like diabetes. Classroom teaching is another area where researchers see great possibilities.

Early studies of the robots’ effectiveness have been promising. Personalized educational robots have taught children vocabulary words in their first and second languages, and how to do long division; they can even help children develop a growth mindset (the belief that talents can be developed through effort and perseverance).   

But there are still important aspects of human teaching that robots can’t offer. What’s more, teachers and robot ethics researchers have raised concerns that overusing robots in the classroom could distract children or hamper their social development.

Several types of social robots are commercially available, with still more being developed in labs around the world. Each has different capabilities and different degrees of human-ness. There’s Nao, developed by SoftBank Robotics for use as a “front desk” virtual assistant for companies and medical centers. A small humanoid capable of moving around its environment, it’s equipped with touch sensors, speech recognition, and cameras for identifying people and objects. Nao has been repurposed for use in classrooms, as well.

Tega, a social, storytelling robot being developed at MIT’s Personal Robots Group, is designed to support early literacy education for children between four and seven years old. The fuzzy, blue-and-red, Furby-like machine has no legs and only stubby arms. It uses a screen to display emotive facial expressions, and it employs machine learning to modify its behaviors based on its interactions with a child.

The robot doesn’t simply tell a pre-recorded story or recite vocabulary words; rather, it uses artificial intelligence to analyze children’s speech, body language, and performance on tasks, then produces socially appropriate responses and personalizes educational content. For example, the robot can choose a classic children’s story with the appropriate level of complexity for the child. While reading, the robot asks the child questions about the story’s factual and emotional content.

The robot can also play collaborative vocabulary games with the child via a tablet, adjusting its behavior to offer hints and explanations, provide encouragement, ask for help, or even give wrong answers that the child needs to correct.

This type of back-and-forth is especially important in early education, says Hae Won Park, an MIT research scientist on the Tega project. “If you think about how young children learn anything, they learn by social interactions,” she says. “And when we use a social robot, we see that children actually treat it as a social partner.”

Park says Tega not only assesses what children have learned — which words or concepts they’ve already mastered — but also how they learn best, determining whether a child benefits more from collaborative activities or from games in which they compete with the robot.

Social robots are not going to replace teachers any time soon. They can’t teach less concrete pursuits, like poetry, or design their own lesson plans. Even the most sophisticated personalization algorithms don’t allow the robots to “understand” the child in the same way a teacher or parent can. “If I had an instructor sitting there with me one-on-one, the human instructor would far outperform the robot,” says Scassellati. “We don’t like to think of it as a comparison at all. We like to think of what [robots and teachers] can do together.”

Still, teachers and experts in robot ethics have concerns. Robots can’t offer students empathy, warmth, or encouraging facial expressions and body language (at least not to the extent that humans can). Those crucial non-verbal cues make human teachers that much more effective. And, detractors say, robot education carries the potential for more damaging long-term consequences.

In a study of primary school teachers in the United Kingdom, respondents worried that the robot would be distracting for children, that it might not have sufficient social skills or empathy, and that interacting with a robot might further isolate specific children who were already somewhat detached from their peers.

“When a child is upset, or anyone really, the first thing we want is to feel that we are being heard,” says Kim Lake, an elementary school teacher from Southampton, Ontario. “Phrases such as, ‘I can see that you are upset right now,’ in combination with facial features and body language, [are] so important. Is a robot ever truly 100 percent capable of fulfilling this?”

Amanda Sharkey, a retired computer science professor at the University of Sheffield who now serves on the executive board of the Foundation for Responsible Robotics, has also raised concerns about how social robots could affect children’s interactions with their peers. For example, she says, children might find it easier to interact with robots than with other children, who are more complicated and won’t always want to do what the child wants. If this leads to children choosing robots as companions rather than peers, “that generally would be a bad thing for society,” says Sharkey.

Social robot creators have many of the same concerns. To discourage unhealthy attachments, both Park and Scassellati say that in their studies, they ensure that the robot-child interaction is limited to half an hour per day.

But Scassellati says robots were never designed to replace human teachers — just to support them. In large classes, teachers may have very limited one-on-one time with each child. Social robots could help assess each child’s skill level, so that when teachers do get that precious individual time, they can immediately provide help rather than trying to figure out where the child is stuck.

Séverin Lemaignan, an associate professor in Social Robotics and AI at the Bristol Robotics Laboratory, says robots’ ability to function as learners themselves can also be highly valuable.

In one project he worked on, with the Computer-Human Interaction in Learning and Instruction Lab at the Swiss Federal Institute of Technology in Lausanne, children with handwriting problems spent time acting as teachers, helping a robot improve its handwriting over the course of five weeks. Although the handwriting practice itself may have been much the same as if the children had been practicing alone, Lemaignan says the social interaction was important for motivation and engagement in a task the children were struggling with themselves.

“[The child had] the feeling of helping someone who was even worse than him, so in terms of self-esteem, it was very good,” says Lemaignan. “The child wanted the robot to succeed.”

Researchers strive to design robot studies with children’s emotional well-being in mind, even when doing so is seemingly unrelated to the experimental outcomes. In one study Scassellati was involved in several years ago, a robot was introduced to a kindergarten classroom for a few weeks. At the end of the experiment, Scassellati and his team were ready to unceremoniously take the robot back to the lab, but a teacher warned them that wasn’t a good idea.

“[He said,] if you just suddenly put this thing in the box and walk away with it, the kids are going to be distraught. They’ve come to think of it as part of their classroom,” Scassellati recalls. So the research team had a going-away party for the robot, just as the class would if a human student were to move away. Afterward, the robot sent postcards to the children to let them know how it was doing.

“It’s not that this is impossible to do in a way that is ethical and responsible,” says Scassellati. “It’s that it takes some thought and some consideration.”


Hannah Thomasy is a writer based out of Toronto and Seattle. 

 

Illustration by Cristina Spanò
