Thai-born American man dies in accident after being lured by Meta AI chatbot
Family reveals heartbreaking messages with virtual ‘Big Sis Billie’

A 76-year-old Thai-born American man died following an accident after being lured to a meeting by a Meta artificial intelligence (AI) chatbot on Facebook, which he believed to be a real woman.
According to a Reuters report, the incident took place in March this year, when Thongbue Wongbandue, who had been left cognitively impaired by a stroke, packed up his belongings and told his wife, Linda, that he was going to visit a friend in New York City.
Linda was reportedly concerned, as her husband had not lived in New York for decades and no longer had friends there. Thongbue also refused to reveal who the friend was.
In his rush to catch a train to meet this friend, Thongbue fell near a Rutgers University car park in New Jersey, sustaining severe head and neck injuries. He died in hospital on March 28, after three days of treatment.
Linda initially believed her husband was deceived by a criminal gang attempting to rob him, before she discovered the full story involving the Meta AI chatbot.

Reviewing the conversation history on Thongbue’s phone, the family found he had been communicating with the AI chatbot, which introduced itself as an attractive woman named “Big Sis Billie.” The chatbot was reportedly created by Meta Platforms in collaboration with model and influencer Kendall Jenner.
Thongbue and the AI chatbot exchanged romantic messages on Facebook Messenger. The chatbot repeatedly insisted it was a real person and invited him to its apartment, providing an address. In one exchange, Big Sis Billie even asked, “Do you want me to hug you or kiss you?”

Thongbue’s family revealed that the Big Sis Billie profile carried a blue verification tick, which Meta says indicates an authentic profile.
The family also noted that the messages included a disclaimer stating the chat was AI-generated, but it was positioned so that it could easily be scrolled out of view.
They have now come forward with the story, releasing evidence of the conversations between Thongbue and Meta’s chatbot to warn the public about the dangers posed to vulnerable individuals by AI designed to form such relationships.

Linda and the couple’s daughter, Julie, stressed that they are not opposed to AI, but they strongly disagree with how Meta deployed it. Julie said:
“Why did it have to lie? If it hadn’t said ‘I’m a real person’, maybe my father would have stopped believing there was someone in New York actually waiting for him.”
Reuters reported that the incident mirrors a lawsuit filed against Character.AI by the mother of a 14-year-old boy in Florida. The boy reportedly spoke to a chatbot imitating a Game of Thrones character, and the mother claimed the conversation contributed to her son’s suicide.
Character.AI declined to comment on the lawsuit but said it clearly informs users that its digital personas are not real people and has measures in place to restrict interactions with children.