Hacker News
Hallucinations in code are the least dangerous form of LLM mistakes
https://simonwillison.net/2025/Mar/2/hallucinations-in-code/
312 comments
2/3/2025