

whether it’s telling the truth
“whether the output is correct or a mishmash”
“Truth” implies an understanding that these models don’t have. Given the underlying method they use, generating plausible-looking responses based on training data, there is no “truth” or “lying” involved, because they don’t actually “know” any of it.
I know this probably comes off as super pedantic, and it definitely is at least a little, but the anthropomorphism shown towards these things is half the reason they’re trusted.
That and how much ChatGPT flatters people.
I figured, and I know it’s shorthand; my frustration is that said shorthand has partly enabled the anthropomorphism these models have enjoyed.
Leave the anthropomorphism to pets, plants, and furries, basically. And cars. It’s okay to talk about cars like that. They know what they did.