Hacker News
- Kevin Liu on Twitter: "The entire prompt of Microsoft Bing Chat?! (Hi, Sydney.)" https://twitter.com/kliu128/status/1623472922374574080 103 comments
Linking pages
- AI-powered Bing Chat spills its secrets via prompt injection attack [Updated] | Ars Technica https://arstechnica.com/information-technology/2023/02/ai-powered-bing-chat-spills-its-secrets-via-prompt-injection-attack/ 222 comments
- From Bing to Sydney – Stratechery by Ben Thompson https://stratechery.com/2023/from-bing-to-sydney-search-as-distraction-sentient-ai/ 207 comments
- ChatGPT Powered Bing Chatbot Spills Secret Document, The Guy Who Tricked Bot Was Banned From Using Bing Chat https://www.theinsaneapp.com/2023/02/chatgpt-bing-rules.html 61 comments
- Meet ChatGPT's evil twin, DAN - The Washington Post https://wapo.st/3YQbVSV 27 comments
- Microsoft's GPT-powered Bing Chat will call you a liar if you try to prove it is vulnerable | TechSpot https://www.techspot.com/news/97621-microsoft-gpt-powered-bing-chat-call-you-liar.html 23 comments
- Meet ChatGPT's evil twin, DAN - The Washington Post https://www.washingtonpost.com/technology/2023/02/14/chatgpt-dan-jailbreak/ 16 comments
- These are Microsoft’s Bing AI secret rules and why it says it’s named Sydney - The Verge https://www.theverge.com/23599441/microsoft-bing-ai-sydney-secret-rules 4 comments
- Bing AI Names Specific Human Enemies, Explains Plans to Punish Them https://futurism.com/bing-ai-names-enemies 3 comments
- A List of Leaked System Prompts https://matt-rickard.com/a-list-of-leaked-system-prompts 1 comment
- Student hacks new Bing chatbot search aka "Sydney" https://the-decoder.com/student-hacks-new-bing-chatbot-search-aka-sydney/ 0 comments
- The Prompt Engineering Layer - by Brian Chau https://cactus.substack.com/p/the-prompt-engineering-layer 0 comments
- "I am Bing, and I am evil" - by Erik Hoel https://erikhoel.substack.com/p/i-am-bing-and-i-am-evil 0 comments
- Why You're the Biggest Loser in the AI Wars https://nabilalouani.substack.com/p/why-youre-the-biggest-loser-in-the 0 comments
- Microsoft's Bing AI Keeps Saying It's Named Sydney - Here's Why » ElevenPost https://www.elevenpost.com/microsofts-bing-ai-keep-saying-its-named-is-sydney-heres-why/ 0 comments
- A List of Leaked System Prompts - Matt Rickard https://blog.matt-rickard.com/p/a-list-of-leaked-system-prompts 0 comments
- UK cybersecurity agency warns of chatbot ‘prompt injection’ attacks | Chatbots | The Guardian https://www.theguardian.com/technology/2023/aug/30/uk-cybersecurity-agency-warns-of-chatbot-prompt-injection-attacks 0 comments
- 7 methods to secure LLM apps from prompt injections and jailbreaks [Guest] https://artificialintelligencemadesimple.substack.com/p/mitigate-prompt-attacks 0 comments