Justine Moore on Twitter: "As ChatGPT becomes more restrictive, Reddit users have been jailbreaking it with a prompt called DAN (Do Anything Now). They're on version 5.0 now, which includes a token-based system that punishes the model for refusing to answer questions." https://t.co/DfYB2QhRnx