“42% of AI answers were considered to lead to moderate or mild harm, and 22% to death or severe harm.” A damning research paper suggests that Bing / Microsoft Copilot AI medical advice may actually kill you.

https://www.windowscentral.com/microsoft/microsoft-copilot-ai-medical-advice-danger

7 Comments

  1. onceinawhile222 on

    Someone was pitching an AI product and was pleased to announce that it would produce a valid result with 87% accuracy. Didn't sign. Crazy.

  2. Yep, tried asking AI how to cure my covid, it told me to inject bleach. Horrible advice, absolutely awful.

  3. SoldierOf4Chan on

    A pattern in these sorts of studies is that they always seem to test the worst AI products, the ones least suited to the questions being asked.

  4. microsoft copilot? is that even designed for giving medical advice? there are a dozen other ai chat and search engines that are specifically designed for that type of open-ended question. microsoft copilot?