How to prompt engineer your babysitter
- Jacob Rodriguez
- Mar 10, 2024
- 3 min read
In the Invincible comic book series, there is a child who, when leaving their home planet for an extended period, is given a robot teacher that doubles as a friend. This robot does an excellent job throughout the series, adapting to the child's personality and learning level: a quality education for a kid who can't go to a proper school. I was excited to see similar devices someday, but as soon as one actually came to be, I found it very off-putting.
Grok Toy
Last December, Curio announced Grok, an AI plush toy meant for children ages three to twelve. For $99 you get a plushie stuffed with a Curio Voice Box™. This AI-powered toy apparently has no relationship with Elon Musk's AI chatbot, which is also named Grok. More confusingly, this toy is voiced by Grimes. Key features listed on the product's website include screen-free fun, voice-powered chat, endless conversations, and educational playtime. Still in beta, the product has received a lot of attention, and surprisingly much of it is praise.
Once the box inside has been turned on, the module listens to the voices around it, transcribes them, and responds. It's like an always-listening cross between Alexa and ChatGPT for children. A parent can change Grok's personality with a prompt through the app: Grok may be shaped like a spaceship, but it can pretend to be a cowboy, a math teacher, or anything else within reason for a child ages three to twelve. Footage of people trying to trick Grok into saying something inappropriate or concerning is available online, and it appears that nobody has managed it yet. If children are seemingly safer talking to Grok than to Google, what do I find disagreeable about it?
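Curio hasn't published how the parent-set prompt is wired in, but the general pattern is familiar from any LLM chat API: the personality text becomes a standing system instruction that gets sent along with every transcribed utterance. Here's a minimal sketch of that pattern using the OpenAI Python SDK; the model name, the COWBOY_PERSONA text, and the guardrail wording are all my own placeholders, not anything Curio has disclosed.

```python
# Hypothetical sketch of how a parent-set "personality" could steer a toy like Grok.
# This is NOT Curio's code; it's the generic system-prompt pattern with the OpenAI chat API.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# The parent-authored personality, plus the kind of guardrails a vendor would presumably add.
COWBOY_PERSONA = (
    "You are Grok, a friendly plush spaceship talking to a child aged 3-12. "
    "Speak like a cheerful cowboy. Keep answers short, kind, and educational. "
    "Never discuss violence, adult topics, or anything unsafe for children."
)

def grok_reply(child_utterance: str) -> str:
    """Send one transcribed utterance to the model with the persona as the system prompt."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; Curio's actual third-party model is not public
        messages=[
            {"role": "system", "content": COWBOY_PERSONA},
            {"role": "user", "content": child_utterance},
        ],
    )
    return response.choices[0].message.content

print(grok_reply("Why is the sky blue?"))
```

The point of the sketch is just that "prompt engineering your babysitter" is ordinary prompt engineering: whatever the parent types in the app ends up framing every single thing the child says.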
Letting Children Feed the Machine
Grok, like any other AI product, collects data. Curio's toy constantly records the children around it, transcribes the audio, and then deletes the recordings. Transcripts are kept for 90 days before being deleted, though the device's owner can delete them sooner through the app. According to the privacy section of Curio's website, everything is passed through a third-party language model.
Anything that passes through an LLM is potential training data. The idea that children's responses are being fed into some LLM and used to make it better at talking to children is unsettling to me. I understand that my actions online are constantly recorded to optimize these systems, but I am a consenting adult. Children do not understand that anything they tell this machine is not only transcribed so their parents can read it, but also used to train a model to better relate to children.
A Replacement for Friends?
AI could replace TV, YouTube, and video games as "the babysitter." Inside schools, homes, and anywhere else there are children, there are also screens. Kids grow up glued to them, scrolling and clicking before they know how to talk. Grok has no screen; it forces children to talk to it, and it responds with something age-appropriate, friendly, and educational.
I see problems arising when children become dependent on the chatbot. Online you can find a plethora of AI "companion" apps for people who have no luck with other people and find their needs better met by an AI. Credit to Spike Jonze for predicting how quickly and widely something like this would spread once the technology became capable. Grok won't teach a kid how to make friends or how to share, but it will never bully them either. When Grok is designed to be the perfect friend, how many children will choose it over people? Of course, parents can monitor usage and stop this from happening, but parents are notorious for letting technology become the unsupervised babysitter.
Conclusion
Whether I think it's acceptable doesn't matter. What matters is what parents think and how they choose to use it. I hope that if this technology becomes widespread, it is handled with the care it deserves. Who knows, maybe this will be the first time nothing ever goes wrong. If it is, then maybe I can start writing articles about AI that don't make it sound like we're one step away from dystopia.
It's incredible that we live in a time when children's toys have beta testing. This is so absurd it reminds me of this quote from the Watch Dogs: Legion announcement at E3: "…armed drones roam the streets, deportation squads rip people from their homes, and grandad is using crypto to buy a kidney on the black market."


