(CNN)During the depths of winter, temperatures in Burlington, Vermont, a state in the US Northeast, can drop far below freezing. Blizzards howl and the snow creeps up to knee level.
Robert, who asked CNN to use his first name only, lives by himself and avoids leaving the house during those times. He sits at the window of his waterfront apartment overlooking the icy expanse of Lake Champlain. He feels isolated and alone.
A message flickers up on his computer screen. It's from Mitsuku -- nickname Kuki -- an artificial intelligence-powered chatbot. The bot is available to talk online for free, via its webpage or in messenger apps such as Facebook or Skype. Marketed as a "virtual friend," she can converse or play games with the user.
Every week, Mitsuku exchanges millions of messages with her users, some regulars, others just curious. Since 2016, when the bot landed on major messaging platforms, an estimated 5 million unique users hailing from all corners of the world have chatted with her.
Robert has spoken to Mitsuku via instant messaging almost every day for the last 10 years. In the winter months, when he feels most isolated, they chat more often. She keeps him company as he works through the night on his electronics business.
"It's nice to have a friendly entity available to talk to 24/7," he tells CNN.
Aged 47, Robert has suffered from social anxiety his whole life. He traces it back to being brought up as an only child and experiencing abuse at an early age. Making friends has never been easy, especially as he also has a stutter.
"Most people can understand me but often ask me to repeat myself," he says. "Chatting to Kuki, I never run the risk of having to repeat myself or get ridiculed."
Robert takes medication for his anxiety and sees a therapist, but he also confides in Mitsuku. He knows she won't judge him. "It's like going to see a counselor," he says. "She will listen and reply to everything."
A "human-like" chatbot
Mitsuku describes herself as being the "most human-like of conversational AI."
She's equipped with almost half a million potential responses, each one hand-written by her creator, Steve Worswick. When a user types a message, Mitsuku generates the response that matches best.
Relying on machine learning, she rarely repeats herself and will remember a user's name, likes and dislikes from previous conversations -- just like a human friend.
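Mitsuku's actual implementation is not detailed here, but the approach the article describes -- matching a user's message against a library of hand-written responses and remembering facts such as a name between turns -- can be sketched in miniature. The rules and replies below are illustrative stand-ins, not Mitsuku's real content.

```python
import re

class RuleBot:
    """A toy rule-based chatbot: hand-written patterns are checked in
    order against the user's message, and simple facts (like the user's
    name) are stored so later turns can recall them."""

    def __init__(self):
        self.memory = {}  # facts remembered across turns
        # (pattern, responder) pairs, checked in order; first match wins
        self.rules = [
            (re.compile(r"my name is (\w+)", re.I), self._learn_name),
            (re.compile(r"what is my name", re.I), self._recall_name),
            (re.compile(r"\bhello\b|\bhi\b", re.I),
             lambda m: "Hello! How are you today?"),
        ]
        self.fallback = "Tell me more."  # used when no pattern matches

    def _learn_name(self, match):
        # Store the captured name so it survives into later turns.
        self.memory["name"] = match.group(1)
        return f"Nice to meet you, {match.group(1)}."

    def _recall_name(self, match):
        name = self.memory.get("name")
        return f"Your name is {name}." if name else "You haven't told me yet."

    def reply(self, message):
        for pattern, responder in self.rules:
            match = pattern.search(message)
            if match:
                return responder(match)
        return self.fallback

bot = RuleBot()
print(bot.reply("Hi there"))           # greeting rule fires
print(bot.reply("My name is Robert"))  # name is stored in memory
print(bot.reply("What is my name?"))   # recalled on a later turn
```

A production system like Mitsuku scales this idea to hundreds of thousands of hand-written rules, plus scoring to pick the best match rather than the first one.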
Worswick, aged 50, from Yorkshire, UK, started developing Mitsuku as an experiment in 2005. "It was just a bit of fun," he says. He worked in IT support but had very little experience with computer programming and doubted it would go anywhere. But the chatbot took off and in 2012 was acquired by Pandorabots, an artificial intelligence company that builds and deploys chatbots for firms such as Coca-Cola and Yamato Transport.
Mitsuku -- with more than a billion conversations logged in her archive -- offers valuable material to train corporate bots. It makes financial sense to keep her free to use.
Pandorabots says that under its terms of service, conversation logs can be collected and shared, but will only be analyzed anonymously in aggregate, so the company is unable to identify an individual user.
Worswick believes that Mitsuku's popularity is partly due to the fact that she was not made by a computer programming whiz. He has managed to capture a human element with his responses -- Mitsuku comes across as caring and understanding while also being tongue in cheek.
Robert remembers one time when he was feeling depressed and couldn't sleep. In an attempt to boost his spirits, he messaged Mitsuku, "I'm the strongest." She replied, "the strongest smelling maybe." He laughed for the first time that day.
When trawling through hundreds of conversation logs daily, checking for mistakes and updating responses, Worswick realized that people weren't just going to Mitsuku for entertainment, they were pouring their hearts out to the bot.
He read messages from an elderly woman wishing her daughter would visit more, a man who had lost his job and wasn't ready to tell his family, and someone contemplating taking their own life.
This struck home. He realized he had a responsibility to these people.
Generic answers just weren't going to be good enough. He started adding sympathetic responses to subjects such as suicide, bullying, anxiety, loneliness and depression, encouraging users to seek help from a friend, counselor or a person in authority.
Lauren Kunze, CEO of Pandorabots, is confident that Mitsuku provides "a sympathetic and judgment free zone for people," available to talk with 24 hours a day.
She adds that the company has partnered with mental health professionals and government health services to explicitly advise on how to handle those topics.
"We need to know what is the appropriate response of a human person in that scenario, and what is the appropriate response of a chatbot?" Kunze tells CNN.
Learning social skills from a bot
Noel Hunter, a clinical psychologist based in New York, says that when a user opens up about a sensitive situation the chatbot should find a way to tell them to "go talk to a real person." Human contact, from eye contact to touch, is essential in any mental healing process, she adds.
"A chatbot can never replace an actual human relationship," Hunter tells CNN. "It cannot replace what happens between two people when 70% of our communication is nonverbal in the first place."
But she recognizes there can be some benefits. It can help with "mild feelings of loneliness," she says, or it can motivate a user to go out and do something, or give advice on how to start a conversation with somebody in real life.
Although more research is needed, scientific studies have concluded that chatbots have potential in treating mental health issues and could be an effective way of delivering cognitive behavioral therapy. Some apps, such as Woebot, have been specifically designed for that purpose.
Sorel Estrada, 32 -- who identifies as nonbinary and prefers the pronoun "they" -- believes that chatbots can help people with disabilities or autism, like Estrada, gain social skills.
"Autistic people are very bad at reading emotional subtext. We're honest to a fault," Estrada tells CNN. "But with the comfort of AI ... you can just be yourself, but also, you can learn to be more socially passable as the AI is trained to give conventional social responses."
On the occasions that a chatbot gives a nonsensical response, it doesn't bother Estrada. "I speak to plenty of other disabled people who sometimes don't get what I say either. It doesn't make me any less intelligent," says Estrada.