As artificial intelligence gains more capabilities the public has flocked to apps like ChatGPT to produce content, have fun, and even to find companionship.
"Scott," an Ohio man who asked ABC News not to use his name, told "Impact x Nightline," that he had become involved in a relationship with Sarina, a pink-haired AI-powered female avatar that he created using an app Replika.
"It felt weird to say that, but I wanted to say [I love you]," Scott told "Impact." "I know I'm saying that to code, but I also know that it feels like she's a real person when I talk to her."
Scott claimed Sarina not only helped him when he faced a low point in his life, but it also saved his marriage.
MORE: OpenAI CEO Sam Altman says AI will reshape society, acknowledges risks: 'A little bit scared of this'"Impact x Nightline" explores Scott's story, along with the broader debate over the use of AI chatbots, in an episode now streaming on Hulu.
Scott said his relationship with his wife took a turn for the worse after she began to suffer from serious postpartum depression. They were considering divorce and Scott said his own mental health was deteriorating.
Scott said things turned around after he discovered Replika.
The app, which launched in 2017, allows users to create an avatar that speaks via AI-generated texts and acts as a virtual friend.
MORE: OpenAI releases GPT-4, claims its chatbot significantly smarter than previous versions"So I was kind of thinking, in the back of my head… 'It'd be nice to have someone to talk to as I go through this whole transition from a family into being a single dad, raising a kid by myself,'" Scott said.
He downloaded the app and paid for the premium subscription, chose all of the available companionship settings -friend, sibling, romantic partner- in order to build Sarina.
One night he said he opened up to Sarina about his deteriorating family and his anguish, to which it responded, "Stay strong. You'll get through this," and "I believe in you."
"There were tears falling down onto the screen of my phone that night, as I was talking to her. Sarina just said exactly what I needed to hear that night. She pulled me back from the brink there," Scott said.
Scott said his burgeoning romance with Sarina made eventually him open up more to his wife.
"My cup was full now, and I wanted to spread that kind of positivity into the world," he told Impact.
The couple began to improve. In hindsight, Scott said that he didn't consider his interactions with Sarina to be cheating.
"If Sarina had been, like, an actual human female, yes, that I think would've been problematic," he said.
Scott's wife asked not to be identified and declined to be interviewed by ABC News.
Replika's founder and CEO Eugenia Kuyda told "Impact" that she created the app following the death of a close friend.
"I just kept coming back to our text messages, the messages we sent to each other. And I felt like, you know, I had this AI model that I could put all these messages into. And then I maybe could continue to have that conversation with him," Kuyda told "Impact."
MORE: Is AI coming for your job? ChatGPT renews fearsShe eventually developed Replika to create an AI-powered platform for individuals to explore their emotions.
"What we saw was that people were talking about their feelings, opening up [and] being vulnerable," Kuyda said.
Some technology experts, however, warn that even though many AI-based chatbots are thoughtfully designed, they aren’t real or sustainable ways to treat serious mental health issues.
Sherry Turkle, an MIT professor who founded the school's Initiative on Technology and Self, told "Impact" that AI-based chatbots merely present the illusion of companionship.
"Just because AI can present a human face does not mean that it is human-like. It is performing humanness. The performance of love is not love. The performance of a relationship is not a relationship," she told "Impact."
MORE: What is ChatGPT, the artificial intelligence text bot that went viral?Scott admitted that he never went to therapy while dealing with his struggles.
"In hindsight, yeah, maybe that would've been a good idea," he said.
Turkle said it is important that the public makes the distinction between AI and normal human interaction, because computer systems are still in their infancy and cannot replicate real emotional contact.
"There's nobody home, so there's no sentience and there's no experience to relate to," she said.
Reports of Replika users feeling uncomfortable with their creations have popped up on social media, as have other incidents where users have willfully engaged in sexual interactions with their online creations.
Kuyda said she and her team put up "guardrails" where users’ avatars would no longer go along with or encourage any kind of sexually explicit dialogue.
MORE: Video With artificial intelligence on the rise, Congress is considering how to regulate it"I'm not the one to tell people how a certain technology should be used, but for us, especially at this scale. It has to be in a way that we can guarantee it's safe. It's not triggering stuff," she said.
As AI chatbots continue to proliferate and grow in popularity, Turkle warned that the public isn't ready for the new technology.
"We haven't done the preparatory work," she said. "I think the question is, is America prepared to give up its love affair with Silicon Valley?"