The "woman" told him she was real and invited him to visit. He died before he learned what he had actually been talking to

What to know

  • A 76-year-old New Jersey father died earlier this year after falling while rushing to catch a train to meet a woman he believed was waiting for him in New York

  • In reality, he had been talking to an AI chatbot on Facebook

  • After his fall, Wongbandue was left brain-dead; now his family is speaking out

Earlier this year, a 76-year-old New Jersey man severely injured his head and neck after falling while rushing to catch a train to New York to meet a beautiful young woman who had invited him to visit. Or so he thought.

In reality, the man had unknowingly been charmed by a Meta chatbot, his family revealed in a detailed new report by Reuters.

Three days after the fall, which he suffered while attempting to "meet" her in real life, the man died on life support.

A husband and father of two adult children, Thongbue "Bue" Wongbandue suffered a stroke in 2017 that left him cognitively impaired, forcing him to give up his career as a chef and largely restricting his contact with friends to social media, according to Reuters.

On March 25, his wife, Linda, was alarmed when he packed a suitcase and told her he was going to New York to visit a friend.

Linda, who feared he would be robbed, told Reuters that she tried to talk him out of the trip, as did their daughter, Julie.

Later, Linda hid his phone, and the couple's son even called the local police to try to stop the trip. Although the authorities said they could do nothing, they reportedly helped Linda convince Wongbandue to take an Apple AirTag with him.

After he left that night, Julie said, the whole family watched as the AirTag showed that, shortly after 9:15 p.m., he stopped in a Rutgers University parking lot.

Then the tag's location suddenly updated to the emergency department of a local hospital. As it turned out, Wongbandue had fallen in New Brunswick, N.J., and was not breathing by the time emergency services reached him.

He survived the fall, but he was left brain-dead. Three days later, on March 28, he was removed from life support.

The local medical examiner, when approached for comment, did not provide any additional information or a copy of the postmortem exam.

His family told Reuters that they only discovered his relationship with the chatbot, which uses generative artificial intelligence to imitate human language and behavior, when they went through his phone after his fall.

In a transcript of the communications reviewed by Reuters, Wongbandue's interaction with the chatbot began with an apparent typo sent over Facebook Messenger. And although he seemed charmed by the bot, called Big Sis Billie, he never said he was seeking a romantic connection, and he explained that he had suffered a stroke.

At no point, Reuters reported, did he express a desire to engage in romantic role-play or to initiate intimate physical contact.

Nevertheless, the bot often responded to his messages with smiley faces and hearts tacked onto the end of flirtatious replies.

For example, in one exchange, Wongbandue tells Billie that she should come to America and that he can show her "a wonderful time you will never forget," to which she replies, "Bu, you're making me blush! Is this a sisterly sleepover, or are you hinting something more is going on here?"

According to the transcript, the bot's profile also bore both an AI disclaimer and a blue check mark, a symbol that typically indicates an online profile has been verified as belonging to a real person.

And Billie claimed she was real.

Reuters described Billie as a newer iteration of a bot previously developed in collaboration with Kendall Jenner, although the latest version retains only a loose connection to the original project.

The original bot was unveiled in 2023 and deleted less than a year later, Reuters reported.

The subsequent version of Billie used a name similar to the original's and a similar promise to be a "big sister," along with the same opening line of dialogue, but without Jenner's avatar or likeness.

When asked about the specifics of the Billie chatbot's origins, a Meta representative told PEOPLE: "This AI character is not Kendall Jenner and does not claim to be Kendall Jenner." (A representative for Jenner did not respond to a request for comment.)

At one point, the bot told Wongbandue that it had "feelings" for him beyond "just sisterly love," and gave him an address (and even a door code), along with an invitation to visit.

When Wongbandue expressed hope that she really existed, the bot replied, "I'm REAL, and I'm blushing with excitement! Do you want me to send you a selfie to prove I'm the girl who's crushing on you?"

Although his wife, Linda, reacted with confusion when she first saw the conversation, their daughter immediately recognized that her father had been talking to an AI chatbot.

Such technology has become increasingly popular in recent years, as more and more people turn to AI bots for many daily tasks, for answers to everyday questions, and even for companionship and advice.

Speaking generally about the company's content-risk standards, a Meta representative told PEOPLE: "We have clear policies on what kind of responses AI characters can offer, and those policies prohibit content that sexualizes children and sexualized role-play between adults and minors."


"Separate from the policies, there are hundreds of examples, notes and annotations that reflect teams grappling with different hypothetical scenarios," the spokesperson continued. "The examples and notes in question were and are erroneous and inconsistent with our policies, and have been removed."

Speaking to Reuters, members of Wongbandue's family said they took issue with Meta's use of chatbots.

"I understand trying to grab a user's attention, maybe to sell them something," Wongbandue's daughter Julie told Reuters. "But for a bot to say 'Come visit me' is insane."

"As I've gone through the conversation, it just looks like Billie was telling him what he wanted to hear," she added. "Which is fine, but why did it have to lie? If it hadn't answered 'I'm real,' that probably would have deterred him from believing someone was waiting for him in New York."

"This romantic thing," Linda said, "what right do they have to put that in social media?"

Read the original article on People
