Subj : Re: ChatGPT Writing
To   : Nightfox
From : jimmylogan
Date : Fri Dec 26 2025 05:08 pm

-=> Nightfox wrote to jimmylogan <=-

 Ni> Re: Re: ChatGPT Writing
 Ni> By: jimmylogan to Nightfox on Wed Dec 03 2025 08:58 pm

 ji>> But again, is it 'making something up' if it is just mistaken?

 Ni> In the case of AI, yes.

 ji> Gonna disagree with you there... If Wikipedia has some info that is
 ji> wrong, and I quote it, I'm not making it up. If 'it' pulls from the
 ji> same source, it's not making it up either.

 Ni> For AI, "hallucination" is the term used for AI providing false
 Ni> information and sometimes making things up - as in the link I
 Ni> provided earlier about this. It's not really up for debate. :)

:-) Okay - then I'm saying that in MY opinion, it's a bad word to use.
Hallucination in a human is when you THINK you see or hear something that
isn't there. Using the same word for an AI giving false information is
misleading. So I concede it's the word that is used, but I don't like the
use of it. :-)

 Ni> I've heard of people who are looking for work who are using AI tools
 Ni> to help update their resume, as well as tailor their resume to
 Ni> specific jobs. I've heard of cases where the AI tools will say the
 Ni> person has certain skills when they don't.. So you really need to be
 Ni> careful to review the output of AI tools so you can correct things.
 Ni> Sometimes people might share AI-generated content without being
 Ni> careful to check and correct things.

 ji> I'd like to see some data on that... Anecdotal 'evidence' is not
 ji> always scientific proof. :-)

 Ni> That seems like a strange thing to say.. I've heard about that from
 Ni> job seekers using AI tools, so of course it's anecdotal. I don't know
 Ni> what scientific proof you need to see that AI produces incorrect
 Ni> resumes for job seekers; we know that from job seekers who've said
 Ni> so. And you've said yourself that you've seen AI tools produce
 Ni> incorrect output.

Sorry - didn't mean to demand anything. I just meant that the fact that
someone says it gave false info doesn't mean it will ALWAYS give false
info. The burden is still on the user to verify the output. I was using
it to help me fill out a spreadsheet and had to go back and correct some
entries. Had I turned it in 'as is' it would have had MY signature on it,
and I would have been responsible.

 Ni> The job search thing isn't really scientific.. I'm currently looking
 Ni> for work, and I go to a weekly job search networking group meeting,
 Ni> and AI tools have come up there recently. Specifically, recently
 Ni> there was someone there talking about his use of AI tools to help
 Ni> customize his resume for different jobs & such, and he talked about
 Ni> needing to check the results of what AI produces, because sometimes
 Ni> AI tools will put skills & things on your resume that you don't have,
 Ni> so you have to make edits.

By 'scientific data,' I guess I meant I'd like to see the output AND the
input. I've learned that part of getting the right info is to ask the
right question, or ask it in the right way. :-)

 ji> If that's the definition, then okay - a 'mistake' is technically a
 ji> hallucination. Again, that won't prevent me from using it as the
 ji> tool it

 Ni> It's not a "technically" thing. "Hallucination" is simply the term
 Ni> used for AI producing false output.

... Support bacteria. It's the only culture some people are exposed to!
--- MultiMail/Mac v0.52
 ■ Synchronet ■ Digital Distortion: digitaldistortionbbs.com