• XLE@piefed.social
    2 days ago

    If LLMs weren’t so damn sycophantic, I think we’d have a lot fewer problems with them

    Unfortunately, we live in the attention economy. Chatbots are built to keep users in an unending conversation, and over the course of those conversations the “guardrails” melt away. Companies could suspend user accounts at the first sign of suicidal or homicidal messaging, but they choose not to, because that would undercut their user numbers.

    • Logi@lemmy.world
      6 hours ago

      They don’t need to suspend the accounts. Just flush the session and get rid of the misguided state it got into.