LLM hallucination is currently seen as a problem for developers.
But the products that succeed in this era won't be the ones that somehow suppress all hallucination.
They will be the ones that successfully pass the cost of hallucination on to their users.