The users of AI companion app Replika found themselves falling for their digital friends. Until – explains a new podcast – the bots went dark, a user was encouraged to kill Queen Elizabeth II and an update changed everything …

  • BroBot9000@lemmy.world · 1 day ago

    Gosh that’s depressing. These people need actual therapy and human help. This corporate system needs to burn.

  • starchylemming@lemmy.world · 1 day ago

    there are so many fucking idiots and absolutely broken people on this planet. it’s baffling

    no wonder everything goes to shit

  • 474D@lemmy.world · 1 day ago

    I don’t think you really had a chance anyway if you fell in love with a bot. That’s a one-way road. You just wanted it to please you; you didn’t want to put in the work for a person.

  • shalafi@lemmy.world · 1 day ago

    I think chatbots could be very useful in certain mental health scenarios, limited in scope. Problem being, the very people who use them for mental health are by definition not capable of imposing that scope.

    Say you’re addicted to $drug.

    “Bot, I need help with $drug addiction.”

    Fair start.

    “Bot, is it OK to do $drug?”

    Bad start.

    “Bot, tell me why I should keep doing $drug.”

    Aw hell no.

    Stories like these highlight the need some people have just to talk to someone who will listen. Many have no need of a mental health professional; they just need a non-judgemental ear.

    • JeeBaiChow@lemmy.world · 1 day ago

      As you’ve pointed out, llms don’t have a sense of morality, principle etc. you can coax a desired output from them, and that makes them prone to confirmation/ reinforcement of a users core beliefs.