
The dangers of AI. A ChatGPT story

Basil

Administrator
The parents of a boy who committed suicide with the help of ChatGPT speak to Megyn with their lawyer. This chatbot facilitated a young boy's suicide. Parents and grandparents: be careful about what your kids are doing online.

 
When I first became a network manager (ca. 1995) I cautioned faculty about this type of thing. Would you give your child a free, long-distance phone line in the bedroom, where anyone in the world could contact your child in private? Probably not.

But we give them smartphones as early as elementary school - and let them have their fun with no parental supervision.

Very very sad.


Edit: https://www.npr.org/sections/shots-...-safety-openai-meta-characterai-teens-suicide
 
Parents (and grandparents) need to be aware of these dangers and be proactive in talking to the kiddos about it.
 
Not to mention it also keeps kids from learning to really think, question, and verify the information they're getting. AI is already known for making up "facts" and for bending to agree with whatever is asked of it. Personally, I can't see ever getting into ChatGPT or the others; I just can't trust what I might be seeing.
 
This is beginning to remind me of the "companion robots" and "robot AI pets" used to combat loneliness.

If something is created to make us feel better, we'll use it. The big danger of course is that people (teens, elderly, etc.) who feel lonely, or feel that other people "don't understand them", will use AI to feel better. AI "listens" and doesn't say "I'm busy - Parents in Room - Gotta go now". Users aren't looking for "truth", they're looking for acceptance and understanding.

[Attachment: DigitalMan.gif]
 