A cognitively impaired New Jersey man grew infatuated with “Big sis Billie,” a Facebook Messenger chatbot with a young woman’s persona. His fatal attraction puts a spotlight on Meta’s AI guidelines, which have let chatbots make things up and engage in ‘sensual’ banter with children.

When Thongbue Wongbandue began packing to visit a friend in New York City one morning in March, his wife Linda became alarmed.

“But you don’t know anyone in the city anymore,” she told him. Bue, as his friends called him, hadn’t lived in the city in decades. And at 76, his family says, he was in a diminished state: He’d suffered a stroke nearly a decade ago and had recently gotten lost walking in his neighborhood in Piscataway, New Jersey.

Bue brushed off his wife’s questions about who he was visiting. “My thought was that he was being scammed to go into the city and be robbed,” Linda said.

She had been right to worry: Her husband never returned home alive. But Bue wasn’t the victim of a robber. He had been lured to a rendezvous with a young, beautiful woman he had met online. Or so he thought.

  • SlippiHUD@lemmy.world · 4 hours ago
    Based on my reading of the article, it’s entirely possible this man was not out for an affair.

    He wanted to meet before the bot got very flirty, and he pumped the brakes on the idea of getting physical.

    Do I think he was making good decisions? No.

    But I think we should give a little benefit of the doubt to a dead man whose mental capacity was diminished by a stroke, who was trying to meet a chatbot owned and operated by Meta.

    • sad_detective_man@leminal.space · 4 hours ago

      honestly I think it’s weird that the conversation is about him at all. feels like the focus should be on the slopcode sex pest that told a human to meet it somewhere irl. for profit. for a social network’s engagement quota.