Subj : Re: ChatGPT Writing
To   : jimmylogan
From : Nightfox
Date : Fri Dec 26 2025 05:40 pm

Re: Re: ChatGPT Writing
By: jimmylogan to Nightfox on Fri Dec 26 2025 05:08 pm

 Ni>> For AI, "hallucination" is the term used for AI providing false
 Ni>> information and sometimes making things up - as in the link I provided

 ji> :-) Okay - then I'm saying that in MY opinion, it's a bad word to use.
 ji> Hallucination in a human is when you THINK you see or hear something
 ji> that isn't there. Using the same word for an AI giving false
 ji> information is misleading.

 ji> So I concede it's the word that is used, but I don't like the use of
 ji> it. :-)

Yeah, it's just the word they decided to use for that with AI. Although it
may sound a little weird applied to AI, I accept it and know what it means.
There are other terms used for other things that I think are worse. :)

 ji> Sorry - didn't mean to demand anything. I just meant the fact that
 ji> someone says it gave false info doesn't mean it will ALWAYS give false
 ji> info. The burden is still on the user to verify output.

Yeah, that's definitely the case, and it's true that it won't always give
false info. From what I understand, AI tends to be non-deterministic, in
that it won't necessarily give the same output even when asked the same
question multiple times.

Nightfox

---
■ Synchronet ■ Digital Distortion: digitaldistortionbbs.com