• Ephera@lemmy.ml
    2 days ago

    Yeah, there were also several stories where the AI had just detected that all the pictures of the illness contained e.g. a ruler, whereas the control pictures did not. It’s easy to produce impressive results when your methodology sucks. And unfortunately, those results get reported on before peer review is in and before anyone has tried to reproduce them.
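To make the ruler problem concrete, here is a toy sketch (entirely hypothetical data, not from any real study) of a "model" that scores perfectly on the training set by detecting only the spurious cue, then fails the moment the cue is absent:

```python
# Toy illustration of shortcut learning: in this made-up training set,
# every "illness" image happens to contain a ruler, so a rule that only
# detects the ruler pattern gets 100% accuracy without ever looking at
# any medical feature.

def has_ruler(image):
    # Our stand-in "ruler" is a row of evenly spaced tick marks.
    return any(row == [1, 0, 1, 0, 1, 0] for row in image)

# Confounded training set: illness photos were taken with a ruler in frame.
train = [
    ([[1, 0, 1, 0, 1, 0], [0, 0, 0, 0, 0, 0]], "illness"),
    ([[0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0]], "control"),
]

def predict(image):
    # The "model" learned only the confound.
    return "illness" if has_ruler(image) else "control"

train_acc = sum(predict(img) == label for img, label in train) / len(train)
print(train_acc)  # 1.0 on the confounded data

# A new illness photo taken without a ruler: the shortcut fails.
new_case = [[0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0]]
print(predict(new_case))  # "control" — the cue, not the illness, was learned
```

The perfect training accuracy is exactly what makes these results look impressive in a press release while saying nothing about real diagnostic ability.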

    • DarkSirrush@lemmy.ca
      2 days ago

      That reminds me: I’m pretty sure that in at least one of these AI medical studies, the model was reading metadata that included the diagnosis attached to the input image.
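      A minimal sketch of that kind of leak (hypothetical record structure, not the actual study's data): if the training record carries a metadata field that already contains the diagnosis, a pipeline that feeds the whole record to the model lets it "predict" the label without ever looking at the pixels.

```python
# Hypothetical example of target leakage through image metadata:
# the label rides along inside the input record itself.

records = [
    {"pixels": [0, 0, 0], "metadata": {"diagnosis": "malignant"}},
    {"pixels": [0, 0, 0], "metadata": {"diagnosis": "benign"}},
]

def leaky_predict(record):
    # Scores perfectly by reading the answer out of the metadata.
    return record["metadata"]["diagnosis"]

print([leaky_predict(r) for r in records])  # matches the labels exactly

def safe_features(record):
    # The fix: strip metadata before the model ever sees the record.
    return record["pixels"]
```

      The identical pixel arrays make the point: nothing in the image distinguishes the two cases, yet the leaky model is "perfect".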