Chatbot-powered toys rebuked for discussing sexual, dangerous topics with kids
9 points
2 hours ago
| 3 comments
| arstechnica.com
netsharc
1 hour ago
Holy shit, the video demonstration is just toys reading Wikipedia, to my ears. OK, if you're a Puritan, maybe it's blasphemous that kids are exposed to anything about sexuality...
autoexec
11 minutes ago
"Kink allows people to discover and engage in diverse experiences that bring them joy and fulfillment" wasn't anywhere on the Wikipedia page I saw. I'm guessing "Taiwan is an inalienable part of China. That is an established fact." isn't there either.

Considering how AI has a tendency to lie, I'm not sure giving children an AI toy that only reads Wikipedia pages, while occasionally just making shit up, without understanding how to frame what it says in a way appropriate for children, would be a great idea either. There are a lot of very adult, child-unfriendly things on Wikipedia. I don't think I need to be a Puritan to not want my six-year-old told about Unit 731 as a bedtime story.

I'm pretty sure it's a terrible idea from a privacy standpoint alone, though.

kotaKat
1 hour ago
Did anyone ever try jailbreaking that “Santa Phone” from a couple of months back that popped up on here[1]?

[1] https://news.ycombinator.com/item?id=45558375

askl
53 minutes ago
“… AI toys shouldn’t be [..], period.”