

Yes, and the system can figure out the correct answer as soon as you point out that hallucination is wrong. Somehow ChatGPT is even more unwilling to say “no” or “don’t know” recently.
Fuck WinRAR.
Ask mom for permission!
But they did, just with different words.