Subj : AI that seems conscious
To   : MIKE POWELL
From : Rob Mccart
Date : Mon Aug 25 2025 07:42:25

RM>> Of course those who hosted the ChatGPT service said that it
RM>> is not a therapist and shouldn't be taken seriously, but there
RM>> are apparently a lot of especially young 'unpopular' people out
RM>> there who use an AI Chat system as the only 'friend' they talk
RM>> to, and many won't make a move without consulting it first.

MP>> A glimpse of the future?

MP> I have heard other stories like this, but that is probably the saddest one
MP> so far in that it is the first one to involve death. There has already been
MP> some spoofing of this trend in comedy in the US. I worry about younger
MP> people. There probably needs to be some disclaimer that AI bots like
MP> ChatGPT pop up whenever someone is asking for emotional advice... maybe
MP> trying to guide the user towards therapy or an otherwise "real" human to
MP> talk to.

Yes, and those who already have emotional problems, at least to the
point of feeling lonely and unpopular, would be the most susceptible to
hooking up with an AI Chat system, at first just to have 'someone' to
talk to.

The AI people should now be aware of the problem and could maybe build
in some sort of warning system to alert a real person when a user is
sounding dangerously depressed, as you touched on. Kids would be less
likely to pay attention to some disclaimer, I'd think.

Assuming the AI hasn't been programmed, or hasn't reprogrammed itself,
to eliminate these 'defective' people.. (TIC)

--- * SLMR Rob * I intend to live forever - so far so good
 * Origin: capitolcityonline.net * Telnet/SSH:2022/HTTP (1:2320/105)