Concerns about LLM hallucinations in mission-critical areas often miss the bigger picture: the processes being replaced are typically manual, human-performed, and riddled with errors, 'hallucinations' of their own. The focus should be on managing risk, not chasing perfection.

#ML #Augmentation