CONTRIBUTED BY
Karolina, ExpertHub Team
DATE
Jul 28, 2025
As humans, we have a tendency to polarize around issues. So it comes as no great surprise that we are currently polarized around the hot topic of global AI integration and its consequences. One side acknowledges it as a source of endless potential, placing great hope in the technology. The other side fears its influence on our jobs and lives in general, shaken by the thought of its reputed autonomy.
Yes, as always, there is also a smaller group in the middle, aware of both its security implications and its transformative power. But emotional reactions are there, not only at this global level but also at the level of personal, individual interactions. As AI enters human experience, it also correlates with human well-being and psychological condition.
A growing body of literature describes how AI chatbots are assuming the role of modern therapists. This is especially true of ChatGPT, which has gained a reputation as an online space for stay-at-home therapy. Projects like Therapist GPT or TherapyAI, built on the GPT platform, are increasingly popular. Articles on how to use the tool for counselling and coaching, and how to prompt it for more focused therapy, span the web today.
Marta Żabicka, a student of Psychology and Computer Science at SWPS University in Warsaw, shares with us some insights into this theme and demonstrates her research project – VincentBot.
AI is still “artificial”
Marta’s insight is unique because, on one hand, she is an innovation-oriented AI enthusiast, while on the other, she is passionate about the human-only aspects of psychology and believes in the importance of human interaction in therapy. To Marta, these two aspects can go hand in hand.
Among my fellow students and the academic community, those professionally connected to this field, there’s definitely a belief in AI’s potential, endless curiosity about it, and absolutely no fear. But among people with limited contact with this technology, there’s much more fear: that AI will become self-aware, that it will start controlling itself. Those feelings of fear are themselves an interesting aspect of psychology.
I do not subscribe to the idea that AI will take psychologists’ jobs. We’re still far from being able to send someone to a one-hour therapy session with an AI as the only therapist and expect an adequate standard of care and support. AI still often loses context, hallucinates, and doesn’t think on as many levels as a human. It also doesn’t provide the feeling that the interaction is between two real people who feel. That the other person sees, hears, and senses the emotions expressed during the session. That they can refer to personal examples that feel believable. Many people still say, in this regard, “AI is a bit artificial”.
Opportunities of understanding
But this technology also brings opportunities, both at the industry level and in individual care. ChatGPT and other AI-based therapeutic tools can be useful for ad-hoc assistance, quickly providing a degree of mental clarity. This is especially helpful when you urgently need to talk something through and challenge a negative thought but don’t have a friend nearby.
In the field of psychology, as Marta points out, AI chat tools can support both therapists and patients, helping therapists become more effective and encouraging patients to feel more confident in seeking help. If someone is ready for real-life therapy, they’ll pursue it. In the past, the internet provided forums and other anonymous ways to share personal struggles, but that didn’t stop those who wanted therapy from going. Now, AI can also encourage those who have always been reluctant.
For example, AI could be used to carry out an initial assessment before a therapy session. Many people find it easier to open up to a chatbot, as there’s often less shame or fear of judgment. This way, by the time someone meets with a therapist, there is already a preliminary overview of their symptoms and concerns. At this stage, such tools are best seen as practical aids that complement the work of therapists, not as replacements.
See for yourself – VincentBot Research
“I want to verify how humans feel after interacting with a chatbot,” says Marta, introducing us to VincentBot, a chatbot she developed for her academic research under the supervision of Dr Maksymilan Bielecki.
How does the interaction with VincentBot differ from a regular ChatGPT experience? What is the objective of conducting such research? What other points of contact between AI and psychology will Marta discover?
We will reveal more details once the research results are official. In the meantime, we will be watching the project grow, and we encourage you to do the same:
If you speak Polish, we invite you to take part in Marta’s research and follow its progress. You can participate via this link; it takes around 15 minutes.

At ExpertHub, we are happy to see innovations like this grow among young and promising experts at the intersection of AI and other fields.
With the right level of interest and resources put into VincentBot, it could grow into a larger project, or even a startup. I have thought about this, and I am looking forward to seeing where it goes in the future!
Don’t forget you can share your experience with Marta's experiment by sending us a message at hello@joinexperthub.com.