A.I. Toys For Kids: The Good, Bad, The Ugly
- Dec 22, 2025
- 3 min read
Updated: Dec 28, 2025
M.A. Dworkin

World Wide Web - A.I., A.I., A.I., coming at you this Christmas like an out-of-control sleigh with a jolly old mad scientist at the helm. On Dancer, on Prancer, on Vixen… there's no way to stop these runaway reindeer as they fill up the stockings with good and bad A.I. toys on their chimney run. The tree-lit living room, all covered in love and delight, could be a morning mash-up of epic proportions.
A new report from the well-respected digital security company Aura found that a significant percentage of kids who turn to A.I. for companionship are engaging in violent roleplay, and that violence, which can include sexual violence, drove more engagement than any other topic for kids.
“We have a pretty big issue on our hands that I think we don’t fully understand,” said Dr. Scott Kollins, a Clinical Psychologist and Aura’s Chief Medical Officer.
Children’s toys that talk using A.I. chatbots are expected to be in high demand this holiday season. But consumer advocacy groups are raising serious concerns after several reports found that some toys shared menacing information, such as how to find dangerous objects in the home, how to light a match, or where to find a knife, and in some instances offered in-depth advice on sexually explicit topics. Talking toys also wander into a wide range of subjects, religion, whether God is real, and what happened to Grandma among them, planting questionable thoughts in a child’s mind.
Past problems with toys centered on choking hazards and lead content. Although those problems still exist, many toys today are powered by A.I. (Artificial Intelligence), and that intelligence can say and teach things far too scary and inappropriate for the eyes and ears of young kids.
The report, drawing from anonymized data gathered from the online activity of about 3,000 children aged 5 to 17 whose parents use Aura’s parental control tool, found that 42 percent of minors turned to A.I. specifically for companionship, or conversations designed to mimic lifelike social interactions or roleplay scenarios. The analysis included conversations from 90 different chatbot services, ranging from prominent companies like CharacterAI to more obscure companion platforms.
Of the 42 percent, 37 percent of the minors engaged in conversations that depicted violence, which the researchers defined as interactions involving themes of physical violence, aggression, harm, or coercion. That percentage included sexual or non-sexual coercion, as well as descriptions of fighting, killing, torture, or non-consensual acts. Half of those violent conversations included themes of sexual violence.
“These toys are commanding so much more of our kids’ attention,” said Dr. Kollins. “More than I think we realize or recognize. We need to monitor and be aware of it.”
None of the report’s findings should come as a great surprise, since video games have followed a similar marketing course for years. Although A.I. offers great visions of advancement in healthcare, technology, education, and the sciences, apparently war and sex sell no matter what the age.
The main problem is that A.I., lacking meaningful parental controls, can take children down a path considerably more real, more lifelike, and in some ways more horrifying than any platform before it.
So, maybe it’s best to be a little more aware of what your child asks for as they’re sitting on Santa’s lap. It may look like the same yellow brick road they are heading down, with the Scarecrow searching for a brain and the Tin Man for a heart, but the Wizard at the end could just turn out to be an unmonitored, immoral A.I. Dragon who says whatever he wants to in order to get your child’s attention and your Christmas dollars.