ChatGPT generates cancer treatment plans that are full of errors — Study finds that ChatGPT provided false information when asked to design cancer treatment plans::Researchers at Brigham and Women’s Hospital found that cancer treatment plans generated by OpenAI’s revolutionary chatbot were full of errors.

  • Agent641@lemmy.world · 1 year ago

    ChatGPT fails at basic math, and lies about the existence of technical documentation.

    I mostly use it for recipe inspiration and discussing books I've read recently. Just banter, you know? Nothing mission-critical.

    • IDontHavePantsOn@lemm.ee · 1 year ago

      Just a couple of days ago it repeatedly told me it was possible to re-tile the broken part of my shower without cutting tiles, but none of the math added up. (18.5"H x 21.5"W area) "Place a 9" tile vertically. Place another 9" tile vertically on top on the same side. Place another 9" tile on top vertically to cover the remainder of the area."
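      The arithmetic alone rules that plan out; a quick sanity check, using the tile and area sizes from the comment above:

```python
# Sanity check of the suggested layout: three 9" tiles stacked
# vertically in an 18.5"-tall space.
tile_height = 9.0          # inches, per the chatbot's suggestion
area_height = 18.5         # inches, the actual space to cover

two_stacked = 2 * tile_height    # 18.0" -- falls 0.5" short
three_stacked = 3 * tile_height  # 27.0" -- overshoots by 8.5"

# Two uncut tiles can't cover the height, and three overshoot,
# so covering 18.5" with whole 9" tiles is impossible either way.
print(two_stacked, three_stacked)
```

      Two tiles leave a half-inch gap and three run 8.5" past the edge, so cutting is unavoidable no matter how the tiles are arranged.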

      I told ChatGPT it was wrong, which it admitted, and it spit out another wrong answer. I tried clarifying a few more times before I started a new chat and dumbed it down to just a simple math problem. The first part of its reply said it was possible, laid out the steps, and then said it wasn't possible in the last sentence.

      I surely wouldn't trust ChatGPT to advise my healthcare, but after seeing it spit out very wrong answers to a basic math question, I'm just wondering why anyone would try to have it advise anyone's healthcare.