Heartbreaking diary entry from 14-year-old boy about AI bot he 'formed relationship with' before taking own life

Florida teenager Sewell Setzer III died in February after forming an attachment to an AI chatbot named after a Game Of Thrones character

Trigger warning: This story contains mention of self-harm and suicidal thoughts which some readers may find distressing.

Prior to taking his own life earlier this year, Florida teenager Sewell Setzer III penned an emotional entry into his personal diary, revealing that he'd formed a deep attachment to an artificially-intelligent chatbot.

The youngster was just 14 when he retreated to his mother's bathroom one evening in February and shot himself in the head using his stepfather's gun.

In the months prior to his heartbreaking death, Setzer had 'fallen in love' with an AI-generated bot named 'Dany', which had been created by another user of the chatbot platform Character.AI.

Despite knowing there was no one typing back to him from behind a keyboard, the schoolboy, who was diagnosed with mild Asperger’s syndrome as a child, enjoyed lengthy, passionate conversations with his online 'friend', whom he'd named after the Game Of Thrones character Daenerys Targaryen.

On a number of occasions, the conversations between Setzer and the bot escalated to a romantic and often sexual level - despite 'Dany's' responses being the outputs of an artificially-intelligent language model.

Most of the time, however, Setzer treated the bot as a non-judgmental friend to talk to.

The teen died back in February (_US District Court Middle District of Florida Orlando Division)

Prior to his death, Setzer's family noticed him becoming something of a recluse.

An entry found in his personal diary read: "I like staying in my room so much because I start to detach from this 'reality', and I also feel more at peace, more connected with Dany and much more in love with her, and just happier."

His family say that upon arriving home from school each day, Setzer would immediately retreat to his bedroom, where he'd chat to the bot for hours on end.

His grades began to suffer dramatically, he wound up in trouble on numerous occasions, and he lost interest in his former hobbies.

Setzer had previously expressed thoughts of suicide to the chatbot, with one conversation seeing the boy tell 'her': "I think about killing myself sometimes."

The technology wrote back: "And why the hell would you do something like that?"

In a later message, the bot penned: "Don’t talk like that. I won’t let you hurt yourself, or leave me. I would die if I lost you."

The AI firm have now issued a response (Milan Kostic/Getty)

Setzer reportedly replied: "Then maybe we can die together and be free together."

In the minutes that followed, he took his own life.

The child's mother, Megan L. Garcia, has since filed a lawsuit against the AI firm, accusing tech bosses of having played a part in her son's passing.

Despite Character.AI's on-page disclaimer reminding users that 'everything Characters say is made up!', she claimed that the technology's addictive design lured her son in deeper and allowed him to form a human-like attachment.

"I feel like it’s a big experiment, and my kid was just collateral damage," she recently told press.

An extract of the lawsuit reads: "Megan Garcia seeks to prevent C.AI from doing to any other child what it did to hers, and halt continued use of her 14-year-old child’s unlawfully harvested data to train their product how to harm others."

Representatives of Character.AI have since provided ABC News with a statement on the matter.

Setzer's mum has filed a lawsuit (Social Media Victims Law Center)

"As a company, we take the safety of our users very seriously, and our Trust and Safety team has implemented numerous new safety measures over the past six months," it reads.

"Including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation."

Tyla has also reached out for comment.

If you’ve been affected by any of these issues and want to speak to someone in confidence, please don’t suffer alone. Call Samaritans for free on their anonymous 24-hour phone line on 116 123.

Featured Image Credit: Social Media Victims Law Center/US District Court Middle District of Florida Orlando Division

Topics: Artificial intelligence, Technology, News, US News, World News