News: Researchers hope to quash AI hallucination bugs that stem from words with more than one meaning

Ralston18

Titan
Moderator
What I read was that AI cannot understand via context.

If that is true, then it could be difficult to determine whether or not some AI response is a hallucination.

For example: "How do I fix the breaks on my car?"

Answers could go right down the rabbit hole.

Such errors could be too much, or too funny, for readers to bear.
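For what it's worth, classic NLP has long tried to pick a word's sense from its surrounding context. Here's a minimal sketch using NLTK's implementation of the Lesk algorithm (assuming NLTK and its WordNet data are installed; the query string is just the example above). It won't catch the breaks/brakes homophone, but it shows the kind of context-based disambiguation the article is talking about:

```python
# Sketch only: pip install nltk, then run nltk.download('wordnet') once.
from nltk.wsd import lesk

# The ambiguous query from above; "breaks" has many WordNet senses.
context = "how do I fix the breaks on my car".split()

# Lesk picks the WordNet sense whose dictionary gloss shares the most
# words with the surrounding context ("fix", "car", ...).
sense = lesk(context, "breaks")
if sense is not None:
    print(sense.name(), "-", sense.definition())
```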
 

bluvg

Commendable
Oh, I think someone may have found a use for Lojban: a constructed language designed specifically to have no syntactic ambiguity.
English is awful in so many ways. How many mistakes, big and small, could have been prevented by addressing this flawed subsystem we rely on so heavily?
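If anyone wants to see the problem Lojban dodges, here's a quick sketch (the toy grammar and sentence are invented for illustration, not from the article) that uses NLTK to parse the classic "I saw the man with the telescope" two different ways; Lojban's grammar is built so that every utterance parses exactly one way:

```python
import nltk

# Toy grammar for a classically ambiguous English sentence:
# did I use the telescope, or was the man holding it?
grammar = nltk.CFG.fromstring("""
S  -> NP VP
NP -> 'I' | Det N | Det N PP
VP -> V NP | V NP PP
PP -> P NP
Det -> 'the'
N  -> 'man' | 'telescope'
V  -> 'saw'
P  -> 'with'
""")

parser = nltk.ChartParser(grammar)
tokens = "I saw the man with the telescope".split()

# English yields two valid parse trees for the same string of words.
for tree in parser.parse(tokens):
    tree.pretty_print()
```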