• medgremlin@midwest.social · 2 days ago

    They don’t use the generative models for this. The AIs that do this kind of work are trained on carefully curated data and operate within a very narrow scope that they’re good at.

    • Ephera@lemmy.ml · 2 days ago

      Yeah, those models are referred to as “discriminative AI”. Basically, if you heard about “AI” from around 2018 until 2022, that’s what was meant.
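
      To make the distinction concrete, here’s a toy sketch of my own (the dataset and classifier are just illustrative stand-ins, not anything from a real medical product): a discriminative model only learns to map inputs to a label for one narrow, well-defined question.

      ```python
      # Toy example of a discriminative model: it learns p(label | features) for
      # one narrow question (benign vs. malignant on scikit-learn's bundled
      # breast-cancer dataset) and can't do anything else.
      from sklearn.datasets import load_breast_cancer
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      X, y = load_breast_cancer(return_X_y=True)
      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

      clf = LogisticRegression(max_iter=5000)   # plain discriminative classifier
      clf.fit(X_train, y_train)

      # Narrow scope: the only thing this model can do is answer this one question.
      print("held-out accuracy:", clf.score(X_test, y_test))
      ```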

      • medgremlin@midwest.social · 1 day ago

        The discriminative AIs are just really complex algorithms and, to my understanding, are not complete black boxes. As someone who receives care for a lot of medical problems, and who will be a physician in about 10 months, I refuse to trust any black-box program with my health or anyone else’s.

        Right now, the only legitimate use generative AI has in medicine is as a note-taker to ease the burden of documentation on providers. Its work is easily checked and corrected, and if your note-taking robot develops weird biases, you can delete it and start over. I don’t trust non-human things to actually make decisions.

        • sobchak@programming.dev · 3 hours ago

          They are black boxes, and can even use the same NN architectures as the generative models (variations of transformers). They’re just not trained to be general-purpose all-in-one solutions, and they have much better-defined and more constrained objectives, so it’s easier to evaluate how they will perform in the real world (unforeseen deficiencies and unexpected failure modes are still a problem, though).
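
          As a rough sketch of what I mean (the sizes and layer counts below are arbitrary toy values, not from any real system): the same transformer backbone can feed either a narrow classification head or a next-token head; what changes is the head and the training objective, not the architecture.

          ```python
          # Same backbone, different heads: the architecture is shared; only the
          # head and the training objective differ.
          import torch
          import torch.nn as nn

          VOCAB, D_MODEL, N_CLASSES = 1000, 128, 2   # arbitrary toy sizes

          embed = nn.Embedding(VOCAB, D_MODEL)
          layer = nn.TransformerEncoderLayer(d_model=D_MODEL, nhead=4, batch_first=True)
          backbone = nn.TransformerEncoder(layer, num_layers=2)

          clf_head = nn.Linear(D_MODEL, N_CLASSES)   # discriminative: one label per sequence
          lm_head = nn.Linear(D_MODEL, VOCAB)        # generative: a distribution over next tokens

          tokens = torch.randint(0, VOCAB, (8, 32))  # batch of 8 sequences, 32 tokens each
          hidden = backbone(embed(tokens))           # shared representation: (8, 32, D_MODEL)

          class_logits = clf_head(hidden.mean(dim=1))  # (8, N_CLASSES) -> narrow, easy to evaluate
          token_logits = lm_head(hidden)               # (8, 32, VOCAB) -> open-ended output space
          print(class_logits.shape, token_logits.shape)
          ```

          The classification head’s output space is a fixed, small set of labels, which is exactly why it’s easier to measure how well it does; the open-ended token head is the part that’s hard to evaluate.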

    • Xaphanos@lemmy.world · 2 days ago

      That brings up a significant problem: widely different things all get called AI. My company’s customers are using AI for biochem and pharm research, protein folding, and other science stuff.

      • medgremlin@midwest.social · 1 day ago

        I do have a tech background in addition to being a medical student, and it really drives me bonkers that we’re calling these overgrown algorithms “AI”. I suppose the generative models come a little closer to earning the label, since they’re black-box programs that develop themselves to a certain extent, but all of the reputable “AI” programs used in science and medicine are very carefully curated algorithms with specific rules and parameters that they follow.

      • jballs@sh.itjust.works · 1 day ago

        My company cut funding for traditional projects and has prioritized funding for AI projects. So now anything that involves any form of automation is “AI”.