“42% of AI answers were considered to lead to moderate or mild harm, and 22% to death or severe harm.” A damning research paper suggests that Bing / Microsoft Copilot AI medical advice may actually kill you.
https://www.windowscentral.com/microsoft/microsoft-copilot-ai-medical-advice-danger
Very clickbaity headline. Answers don’t kill people.
Someone pitching an AI product to us was pleased to announce that it would produce a valid result with 87% accuracy. We didn’t sign. Crazy.
Yep, tried asking AI how to cure my covid, it told me to inject bleach. Horrible advice, absolutely awful.
So castration isn’t a cure for baldness? Dammit!
A pattern of these sorts of studies is they always seem to ask the worst AI products least suited to what they’re asking for.
I literally cannot believe this is being allowed to happen.
Microsoft Copilot? Is that even designed for giving medical advice? There are a dozen other AI chat and search engines that are specifically designed for that type of open-ended question. Microsoft Copilot?