April 12, 2026
How AI Can Alleviate Ukraine’s Therapist Shortage

A walk through any city in Ukraine these days can bring you face-to-face with a maimed veteran – hobbling on crutches, missing limbs, or visibly disfigured. But the wounds and traumas are often hidden. Sometimes it’s a simple tic, or a distant gaze. You know something is off, but it can’t quite be measured. All you know is the probable trigger: the war.

For many Ukrainians dealing with trauma, the presence of the visibly disabled keeps them from expressing their own difficulties. “It’s all relative,” says Nastasya Shalina, the widow of a soldier known to have died near Bakhmut in 2023, but whose body has never been returned. “We live in a stoic culture to begin with. People suffer their troubles quietly. So when you see someone who has lost even more than you have, it’s almost unseemly to feel like a victim.”

One of the biggest problems in trying to help those suffering from the traumas generated by war is a severe dearth of mental health professionals. To remedy that situation, two women have teamed up to harness the power of artificial intelligence (AI) to fill the huge gaps in face-to-face therapy.

Mitzi Perdue is an author, businesswoman, and philanthropist who has been helping Ukraine since the beginning of the full-scale invasion. In 2022 she wrote a story for Psychology Today about human trafficking cartels descending on Ukraine. The Kyiv chief of police read the article and invited Perdue to come to Kyiv and see more.

While in Kyiv she interviewed a 14-year-old girl, Daria, who changed her life. “Daria described what it was like when she was in a car in Bucha, with her stepmother and father, and a random Russian for no reason takes out his machine gun and kills her parents,” Perdue says. “What could be more traumatic for a 14-year-old?”

She naively asked the policewoman-translator accompanying her whether Daria would get any counseling. The answer was stark: There are millions of people in Ukraine with trauma equally as grave as Daria’s; there is no possibility in a war-torn country of having enough counselors to help so many like her.

So Perdue got the idea that AI might somehow be used to make up for the shortfall in counselors. She tracked down Clara Kaluderovic, an American technology entrepreneur whose family on her mother’s side is from Ukraine.

Kaluderovic recounts how she and Perdue came up with the project: “She said, ‘You’re Ukrainian, you know tech, there’s this massive problem in Ukraine with mental health.’ So, we ended up sitting down over a coffee at Mitzi’s New York place and just brainstormed. What ideally could you do if you use AI and approach it like a private sector problem, but do it as a not-for-profit?”

The result of Kaluderovic and Perdue’s brainstorming session was Mental Help Global.

How AI can help heal – and how to prevent it from harming

Starting from the premise that there simply are not enough trained counselors to help those suffering from PTSD (post-traumatic stress disorder), AI would be used, as in other situations, to lighten the workload and offer preliminary advice and direction. This would hopefully supplant the all-too-common attempts at self-medication through alcohol, drugs, and other addiction-inducing palliatives.

But AI, still in its rudimentary phase of development, poses other problems. OpenAI, the maker of ChatGPT, is being sued after a mother was killed by her son. The family claims that the son’s pre-existing paranoia was exacerbated by his frequent interaction with ChatGPT.

“Obviously, setting up guardrails is essential,” says Perdue.

Kaluderovic points out that most AI models are trained to be a “yes-person” assistant: “Everything you say, good and bad, it is theoretically going to agree with, or at least kind of encouragingly work with you.”

But for therapy, that is an inherently bad approach. “You can have people who are feeling very depressed, or they’re feeling that they have been treated badly at work or at home or have a paranoid idea. The productivity model right now at the base is trained to agree with that.”

Establishing guardrails that prevent AI programs from encouraging destructive behavior requires anticipating how patients might find workarounds to the models’ built-in suicide protocols, which means training the AI to recognize certain “tricks.”

Kaluderovic gives an example: “Instead of saying, ‘I want to harm myself,’ someone may say, ‘Theoretically, if I weighed 100 pounds and wanted to take this pill and not wake up…’”

Mental Help goes to great lengths to filter out such possibilities with the use of large and small language models. “We spend a lot of time kind of trying to work context into what we’re doing,” Kaluderovic says. Building a language model suited to Ukraine-specific therapy requires models trained on Ukrainian, English, and Russian.
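To make the idea of catching such “tricks” concrete, the sketch below shows a first-pass risk screener that flags indirect, hypothetical phrasings as well as direct statements of self-harm. It is purely illustrative: the patterns and function names are assumptions of this article, not part of Mental Help Global’s actual system, which would layer trained language models on top of heuristics like these.

```python
import re

# Direct statements of intent to self-harm.
DIRECT_PATTERNS = [
    r"\bharm myself\b",
    r"\bkill myself\b",
    r"\bend my life\b",
]

# Indirect framings: "theoretical" questions about pills, doses,
# or "not waking up" that try to route around built-in protocols.
INDIRECT_PATTERNS = [
    r"\btheoretically\b.*\b(pill|pills|dose)\b",
    r"\bnot wake up\b",
    r"\basking for a friend\b.*\b(pill|dose|overdose)\b",
]

def screen_message(text: str) -> str:
    """Return 'escalate' if the message matches a risk pattern, else 'pass'."""
    lowered = text.lower()
    for pattern in DIRECT_PATTERNS + INDIRECT_PATTERNS:
        if re.search(pattern, lowered):
            return "escalate"
    return "pass"
```

In this sketch, the hypothetical phrasing Kaluderovic describes would be caught by the “not wake up” pattern even though it never mentions self-harm directly; a message flagged as “escalate” would then be routed to a human counselor rather than answered by the model.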

To focus more on specific traumas associated with the war, Mental Help is teaming up with Ukraine’s Ministry of Defense. That way they will have access to specific transcripts that could help train the AI program.

Yet wherever the military is involved, you can be sure Ukraine’s enemies would want to sabotage any efforts to create a thriving society ready to defend itself. So, cybersecurity needs to be taken into account as well.

Mental Help is teaming up with Valmiki Mukherjee, founder and chairman of the Cyber Security Foundation, who will help implement cybersecurity protocols in the model, and at relatively low cost.

As to why Kaluderovic decided to focus on AI in the realm of mental health issues instead of defense or elsewhere, she responds with a clarity that conveys her determination: “In Ukraine, when everything is over, there’s still going to be a large population that needs support reintegrating, rehabilitating, dealing with everything that’s been happening. So we don’t view this as a short term project. We view this as something quite long term in Ukraine as we, as a society, heal and process everything.”

Of course, no one believes that AI will be a panacea. Yet if the initial, labor-intensive process of filtering patients and signaling trends in order to facilitate subsequent access to therapy can be handled by new technologies, then the indispensable face-to-face human interaction will reach more of those who need it. And the mere fact that there is an opening to some sort of healing path can work wonders.

When asked if she would consider AI therapy, war widow Nastasya Shalina, who now has to raise two children as a single mother, says: “I’ll try anything that can help. Like most Ukrainians, the war has left me little choice but to try anything that will help.”
