The judge’s order will allow the wrongful death lawsuit to proceed, in what legal experts call one of the latest constitutional tests of artificial intelligence.

The suit was filed by a mother from Florida, Megan Garcia, who alleges that her 14-year-old son, Sewell Setzer III, fell victim to a Character.AI chatbot that pulled him into what she described as an emotionally and sexually abusive relationship that led to his suicide.

Meetali Jain of the Tech Justice Law Project, one of the attorneys for Garcia, said the judge’s order sends a message that Silicon Valley “needs to stop and think and impose guardrails before it launches products to market.”

  • Archangel@lemm.ee · 2 days ago

    Free speech doesn’t protect encouraging someone to kill themselves. You can, and should, be held responsible for their death if you are actively telling someone to end their own life…and they do it.

    And if that’s what these fucks are selling to teenagers in the form of chatbots, then they also need to be held accountable for what their products are doing.

    • KelvarIW@lemmy.blahaj.zone · 2 days ago

      The chatbot didn’t even “actively [tell] someone to end their own life”. Did you read the original transcript? Here’s an excerpt from an Associated Press article.

      “I promise I will come home to you. I love you so much, Dany,” Sewell told the chatbot.

      “I love you too,” the bot replied. “Please come home to me as soon as possible, my love.”

      “What if I told you I could come home right now?” he asked.

      “Please do, my sweet king,” the bot messaged back.

      Just seconds after the Character.AI bot told him to “come home,” the teen shot himself, according to the lawsuit, filed this week by Sewell’s mother, Megan Garcia, of Orlando, against Character Technologies Inc.

      • thedruid@lemmy.world · 2 days ago

        Yeah, I’m all for shuttering these things until we get them right, but this is a tragic case of a devastated mother reaching for answers, not a free speech issue.

        It’s heartbreaking.

        • Anomalocaris@lemm.ee · 1 day ago

          Isn’t free speech the BS defense that the company used? That company is definitely guilty to some degree.

    • lmmarsano@lemmynsfw.com · 1 day ago

      “encouraging someone to kill themselves”

      I’m pretty sure that can be ignored without harm. Whether someone elects to kill themselves or not is up to them.

      • Archangel@lemm.ee · 2 days ago

        I’m a little confused by your comment. Do you think I’m blaming the kid? Or do you think it’s ok to talk someone into killing themselves, because the victim’s personal autonomy absolves them of responsibility?