Subj : Re: ChatGPT Writing
To   : phigan
From : jimmylogan
Date : Fri Dec 26 2025 05:08 pm

-=> phigan wrote to jimmylogan <=-

 ph> Re: Re: ChatGPT Writing
 ph> By: jimmylogan to phigan on Tue Dec 02 2025 11:15 am

 > it's been flat out WRONG before, but never insisted it was

 ph> You were saying you'd never seen it make stuff up :). You certainly
 ph> have. Just today I asked the Gemini in two different instances how to
 ph> do the same exact thing in some software. One time it gave instructions
 ph> for one method, and the second time it said the first method wasn't
 ph> possible with that software and a workaround was necessary.

 > Time saw raw emit level racecar level emit raw saw time.

 ph> Exactly, there it is again saying something is a palindrome when it
 ph> isn't.

 ph> Example of a palindrome:
 ph> able was I ere I saw elba

 ph> Not a palindrome:
 ph> I palindrome I

I'm not denying it can be wrong, as clearly it can be. My disagreement
is with equating "wrong" with "made up." In both of your examples, a
simpler explanation fits: it produced what it believed was the correct
answer based on its internal model and was mistaken. Humans do that
constantly, and we don't normally say they are making things up unless
there is intent or invention involved. Calling every incorrect output
"made up" or a "hallucination" blurs the distinction between
fabrication and error, and I don't think that helps people understand
what the tool is actually doing.

.... Electricians have to strip to make ends meet.

--- MultiMail/Mac v0.52
 þ Synchronet þ Digital Distortion: digitaldistortionbbs.com