Hacker News
The Dual LLM pattern for building AI assistants that can resist prompt injection
https://simonwillison.net/2023/Apr/25/dual-llm-pattern/
109 comments
13/5/2023
Lobsters
The Dual LLM pattern for building AI assistants that can resist prompt injection
https://simonwillison.net/2023/Apr/25/dual-llm-pattern/
7 comments
25/4/2023
ai, security