Can chatbots replace human therapists? Some startups – and patients – claim that they can. But it is not exactly settled science.
One study found that 80% of people who used OpenAI’s ChatGPT for mental health advice consider it a good alternative to conventional therapy, while reports have found that chatbots can be effective in reducing certain symptoms of depression and anxiety. On the other hand, it is well established that the relationship between therapist and client – human connection, in other words – is among the best predictors of success in mental health treatment.
Three entrepreneurs – Dustin Klebe, Lukas Wolf and Chris Aeberli – are firmly in the pro-chatbot-therapy camp. Their startup, Sonia, offers an “AI therapist” that users can speak or text with via an iOS app about a range of topics.
“To a certain extent, building an AI therapist is like developing a drug, in the sense that we’re building a new technology rather than repurposing an existing one,” Sonia CEO Aeberli told TechCrunch in an interview.
The three met in 2018 while studying computer science at ETH Zürich and moved to the United States together to pursue graduate studies at MIT. Shortly after graduating, they reunited to launch a startup that could channel their shared passion for scalable technology.
That startup became Sonia.
Sonia uses a number of generative AI models to analyze what users say during “therapy sessions” in the app and respond to it. Applying techniques from cognitive behavioral therapy (CBT), the app, which charges users $20 per month or $200 per year, assigns “homework” aimed at driving home insights from those conversations, along with visualizations designed to help users identify their biggest stressors.
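The article’s description is all we have to go on, but for a rough sense of what a generative-AI “session turn” with CBT-flavored homework could look like, here is a minimal Python sketch. Everything in it – the prompts, the two-call structure, the `session_turn` helper, the model name – is a hypothetical illustration built on OpenAI’s public chat API, not Sonia’s actual code.

```python
# Hypothetical sketch of one CBT-style "session turn" - illustrative only,
# not Sonia's actual pipeline.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You apply cognitive behavioral therapy techniques: reflect the user's "
    "statement, surface the automatic thought behind it, and suggest a reframe."
)

def session_turn(history: list[dict], user_message: str) -> tuple[str, str]:
    """Return (reply, homework) for one exchange in a session."""
    messages = [{"role": "system", "content": SYSTEM_PROMPT}, *history,
                {"role": "user", "content": user_message}]
    reply = client.chat.completions.create(
        model="gpt-4o-mini", messages=messages,
    ).choices[0].message.content

    # A second call distills the exchange into a small exercise, loosely
    # analogous to the "homework" the app assigns between sessions.
    homework = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Turn this exchange into one short "
             "CBT homework exercise, e.g. a thought record or a stressor log."},
            {"role": "user", "content": f"User: {user_message}\nReply: {reply}"},
        ],
    ).choices[0].message.content
    return reply, homework
```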

Aeberli says that Sonia, which has not received FDA approval, can tackle problems ranging from depression, stress and anxiety to relationship troubles and poor sleep. For more serious scenarios, such as people contemplating violence or suicide, Sonia has “additional algorithms and models” to detect “emergency situations” and direct users to national hotlines, Aeberli says.
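Sonia hasn’t published how those “additional algorithms and models” work. A plausible minimal version is a cheap classifier pass that runs before the normal pipeline and short-circuits to a hotline; the prompt and routing logic below are assumptions for illustration (988 is the real US Suicide & Crisis Lifeline number).

```python
# Hypothetical sketch of an "emergency situation" gate - illustrative
# assumptions, not Sonia's actual detection models.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

HOTLINE_MSG = (
    "It sounds like you may be in crisis. Please contact the 988 Suicide & "
    "Crisis Lifeline right now: call or text 988 (US)."
)

def is_emergency(user_message: str) -> bool:
    """Cheap classifier pass that runs before the normal therapy pipeline."""
    verdict = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Reply with exactly CRISIS or SAFE. CRISIS means the "
                        "message suggests imminent risk of suicide, self-harm "
                        "or violence toward others."},
            {"role": "user", "content": user_message},
        ],
    ).choices[0].message.content.strip().upper()
    return verdict.startswith("CRISIS")

def route(user_message: str) -> str:
    if is_emergency(user_message):
        return HOTLINE_MSG  # short-circuit: bypass the chatbot entirely
    return "(hand off to the normal session pipeline)"
```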
Somewhat alarmingly, none of Sonia’s founders has a background in psychology. But Aeberli says the startup consults with psychologists, recently hired a cognitive psychology graduate and is actively recruiting a full-time clinical psychologist.
“It is important to emphasize that we do not consider human therapists, or companies providing physical or virtual human-led mental health care, to be our competition,” Klebe said. “For each response Sonia generates, there are around seven additional language model calls happening in the background to analyze the situation from several different therapeutic perspectives in order to adjust, optimize and personalize Sonia’s chosen therapeutic approach.”
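Klebe’s description suggests a fan-out/fan-in pattern: several background model calls critique a draft from different therapeutic angles, and a final call folds the critiques back in. The sketch below is a guess at that shape; the list of seven “lenses” is invented here to match the count in the quote, and none of it is Sonia’s actual code.

```python
# Hypothetical fan-out/fan-in sketch of the "seven additional language model
# calls" Klebe describes. The seven therapeutic "lenses" are invented.
from concurrent.futures import ThreadPoolExecutor
from openai import OpenAI

client = OpenAI()

LENSES = [
    "cognitive behavioral therapy",
    "acceptance and commitment therapy",
    "motivational interviewing",
    "psychodynamic framing",
    "solution-focused brief therapy",
    "mindfulness-based stress reduction",
    "risk and safety review",
]

def critique(lens: str, user_msg: str, draft: str) -> str:
    """One background call: critique the draft reply from a single lens."""
    return client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content":
             f"From the perspective of {lens}, briefly critique this draft."},
            {"role": "user", "content": f"User: {user_msg}\nDraft: {draft}"},
        ],
    ).choices[0].message.content

def refine(user_msg: str, draft: str) -> str:
    """Run the background critiques concurrently, then fold them into a reply."""
    with ThreadPoolExecutor(max_workers=len(LENSES)) as pool:
        notes = list(pool.map(lambda lens: critique(lens, user_msg, draft),
                              LENSES))
    return client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content":
             "Revise the draft using these critiques. Return only the reply."},
            {"role": "user", "content":
             f"User: {user_msg}\nDraft: {draft}\nCritiques:\n" + "\n".join(notes)},
        ],
    ).choices[0].message.content
```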
What about privacy? Can users be assured that their data isn’t sitting in a vulnerable cloud, or being used to train Sonia’s models without their knowledge?
Aeberli says that Sonia is committed to storing only the “absolute minimum” amount of personal information needed to administer therapy: a user’s age and name. However, he did not address where, how or for how long Sonia stores conversation data.

Sonia, which has around 8,000 users and $3.35 million in funding from investors including Y Combinator, Moonfire, Rebel Fund and SBXi, is in talks with unnamed mental health organizations to offer Sonia as a resource through their online portals. So far, Sonia’s reviews on the App Store are quite positive, with several users noting that they find it easier to talk to the chatbot about their problems than to a human therapist.
But is it a good thing?
Today’s chatbot technology is limited in the quality of advice it can give – and it may not pick up on subtler signs of a problem, such as a person with anorexia asking how to lose weight. (Sonia wouldn’t even know the person’s weight.)
Chatbots’ responses are also colored by biases – often the Western biases reflected in their training data. As a result, they are more likely to miss cultural and linguistic differences in the way a person expresses mental illness, especially if English is that person’s second language. (Sonia only supports English.)
In the worst cases, chatbots go off the rails. Last year, the National Eating Disorders Association drew criticism for replacing humans with a chatbot, Tessa, which gave weight-loss advice that was triggering for people with eating disorders.
Klebe stressed that Sonia is not trying to replace human therapists.

“We are building a solution for the millions of people who are struggling with their mental health but cannot (or do not want to) access a human therapist,” said Wolf. “We aim to fill the gigantic gap between demand and supply.”
There is certainly a gap – both in the ratio of professionals to patients and in the cost of treatment relative to what most patients can afford. More than half of the U.S. lacks adequate geographic access to mental health care, according to a recent government report. And a recent survey found that 42% of U.S. adults with a mental health condition were unable to receive care because they could not afford it.
An article in Scientific American notes that therapy apps tend to cater to the “worried well” – people who can afford both therapy and apps – rather than the isolated people who may be most at risk but don’t know how to seek help. At $20 per month, Sonia isn’t exactly cheap – but Aeberli maintains that it’s cheaper than a typical therapy session.
“It is much easier to start using Sonia than to see a human therapist, which involves finding a therapist, sitting on a waiting list for four months, showing up at a set time and paying $200,” he said. “Sonia has already seen more patients than a human therapist could see over an entire career.”
I can only hope that Sonia’s founders remain transparent about the problems the app can and cannot solve as they build it out.