• zbyte64@awful.systems
    10 hours ago

    When LLMs get it right, it’s because they’re summarizing a Stack Overflow or GitHub snippet they were trained on. But you lose all the benefit of other humans commenting on the context, the pitfalls, and the alternatives.

    • PotentialProblem@sh.itjust.works
      8 hours ago

      You’re not wrong, but often I’m just trying to do something I’ve done a thousand times before, and I already know the pitfalls. Besides, I’m sure I’ve copied code from Stack Overflow before.

    • Honytawk@feddit.nl
      8 hours ago

      You mean things you had to do anyway even if you didn’t use LLMs?