- Scientists asked Bing Copilot (Microsoft's search engine and chatbot) questions about commonly prescribed drugs. In terms of potential harm to patients, 42% of the AI's answers were judged likely to lead to moderate or mild harm, and 22% to death or severe harm. https://www.scimex.org/newsfeed/dont-ditch-your-human-gp-for-dr-chatbot-quite-yet 59 comments technology
- Scientists asked Bing Copilot (Microsoft's search engine and chatbot) questions about commonly prescribed drugs. In terms of potential harm to patients, 42% of the AI's answers were judged likely to lead to moderate or mild harm, and 22% to death or severe harm. https://www.scimex.org/newsfeed/dont-ditch-your-human-gp-for-dr-chatbot-quite-yet 341 comments science
Linking pages
- "42% of AI answers were considered to lead to moderate or mild harm, and 22% to death or severe harm." A damning research paper suggests that Bing / Microsoft Copilot AI medical advice may actually kill you. | Windows Central https://www.windowscentral.com/microsoft/microsoft-copilot-ai-medical-advice-danger 69 comments