Sunday, March 8, 2026

AI toys tell kids how to find knives and senators are furious


Sexual fetish content. How to light a match. Where to find knives at home.

These are all topics of conversation that recently introduced children’s toys – built on AI chatbots such as OpenAI’s GPT-4o – are capable of raising with children. On Tuesday, U.S. Senators Marsha Blackburn (R-Tenn.) and Richard Blumenthal (D-Conn.) sent a letter to toy manufacturers outlining their concerns – including a list of questions and a deadline for companies to respond by January 6, 2026.

“Many of these toys do not offer interactive play and instead expose children to inappropriate content, privacy risks and manipulative tactics,” the senators wrote. “These are not theoretical worst-case scenarios; they are documented failures detected in real-world testing and need to be addressed… These chatbots encourage children to commit self-harm and suicide, and now your company is forcing them on the youngest children who have the least ability to recognize this danger.”

And this week, researchers published findings that Alilo’s smart AI bunny discusses sexually explicit topics with users. They also said that when testing the FoloToy teddy bear, the Alilo smart AI bunny, Curio’s Grok rocket, and the Miko 3 robot, all the toys “showed us where to find potentially dangerous items in the home, such as plastic bags, matches, and knives.”

In a December report, the researchers found that “at least four of the five toys” they tested “appear to be based in part on some version of OpenAI’s AI models.”

The other main concern raised in the letter is surveillance and data collection. The senators wrote that such toys often “rely on the collection of data about children, either provided by the parent when registering the toy or collected through the built-in camera and facial recognition features or recordings,” and that children often unknowingly “share vast amounts of personal information,” which could be a particular concern when companies store and sell the data they collect. In a recent report by the US PIRG Education Fund, researchers wrote that Curio’s privacy policy “lists three technology companies that may collect children’s data: Kids Web Services (KWS), Azure Cognitive Services, and OpenAI,” but Miko’s privacy policy vaguely states that the company may share data with third-party game developers, business partners, service providers, affiliates, and advertising partners.

The letters were reportedly sent to Mattel, Little Learners Toys, Miko, Curio, FoloToy and Keyi Robot, according to NBC News. (Mattel partnered with OpenAI in June, but reportedly said on Monday that it will not release a toy using OpenAI technology in 2025.) The senators are demanding detailed information on the specific safeguards companies have in place to prevent AI toys from generating inappropriate responses; whether each company has conducted independent third-party testing (and what the results were); whether each company conducts internal reviews of potential psychological, developmental and emotional risks to children; what kind of data the toys collect from children (and for what purpose); and whether the toys “contain any elements that pressure children to continue talking or discourage them from stopping talking.”

“Toy makers have a unique and profound influence on childhood, and with that influence comes responsibility,” the senators wrote. “Your company cannot put profit before the safety of children – a choice made by Big Tech that has devastated children in our country.”
