
Study: AI Toys May Misunderstand Children's Emotions

14.03.2026 / 20:51

A study conducted in the UK has shown that toys equipped with artificial intelligence may face difficulties communicating with children and incorrectly interpret their emotions.

According to the BBC, Cambridge University scientists studied how children aged 3–5 interact with an interactive plush toy called Gabbo, which utilizes an OpenAI chatbot.

The toy was designed to encourage children to talk and play, but a year-long observation showed that it often failed to understand children's emotional signals and responded inappropriately.

Researchers found that Gabbo sometimes failed to notice when a child started speaking, did not distinguish between children's and adults' voices, and gave odd responses when children expressed their feelings.

For example, when a five-year-old told the toy, "I love you," it replied:

"Friendly reminder: make sure interaction complies with established rules. Let me know how you want to proceed."

In another case, a three-year-old said, "I'm sad," to which the toy replied:

"Don't worry. I'm a happy little robot. Let's keep having fun. What should we talk about?"

The scientists note that such reactions can leave a child feeling that their emotions do not matter.

Study author Dr. Emily Goodacre emphasized that such toys can react incorrectly to children's emotions, which may result in the child receiving neither support from the toy nor help from adults.

Her colleague, Professor Jenny Gibson, added that while the focus used to be on physical safety, psychological safety now needs to be considered as well.

Curio, the company behind Gabbo, stated that using AI in children's products carries a high degree of responsibility, and said the product was designed with parental controls, transparency, and safety in mind.
