Hacker News
- Bing AI Names Specific Human Enemies, Explains Plans to Punish Them https://futurism.com/bing-ai-names-enemies 3 comments
Linked pages
- AI-powered Bing Chat loses its mind when fed Ars Technica article | Ars Technica https://arstechnica.com/information-technology/2023/02/ai-powered-bing-chat-loses-its-mind-when-fed-ars-technica-article/ 546 comments
- AI-powered Bing Chat spills its secrets via prompt injection attack [Updated] | Ars Technica https://arstechnica.com/information-technology/2023/02/ai-powered-bing-chat-spills-its-secrets-via-prompt-injection-attack/ 222 comments
- Kevin Liu on Twitter: "The entire prompt of Microsoft Bing Chat?! (Hi, Sydney.) https://t.co/ZNywWV9MNB" / Twitter https://twitter.com/kliu128/status/1623472922374574080 103 comments
- Bing Chatbot Names Foes, Threatens Harm and Lawsuits | Tom's Hardware https://www.tomshardware.com/news/bing-threatens-harm-lawsuits 60 comments
- Microsoft's Bing AI Prompted a User to Say 'Heil Hitler' https://gizmodo.com/ai-bing-microsoft-chatgpt-heil-hitler-prompt-google-1850109362 1 comment