Baltas Group
ChatGPT is one of the most popular artificial intelligence tools of recent years. It answers questions, generates ideas, and even provides emotional support through conversation. Its easy accessibility, quick answers, and support for learning make it attractive. However, research highlights not only its benefits but also its hidden psychological risks.
The Supportive Aspects of ChatGPT
Scientific studies show that chatbot tools like ChatGPT can have positive effects, particularly for individuals experiencing mild to moderate anxiety or depression. A comprehensive review of 18 clinical studies found that these chatbots significantly reduce symptoms of anxiety and depression (APSA, 2024). Notably, users report not only a decrease in symptoms during these interactions but also a supportive bond and a sense of trust.
This supportive effect is not limited to mental well-being; it also extends to creativity and learning. ChatGPT's fluent, flexible language makes it easier for users to generate new ideas, develop different perspectives, and work through problems. The diverse content it offers also speeds up access to information and enriches the learning experience. In this sense, ChatGPT is not merely a system that transfers information; it functions as an assistant that enhances productivity, innovative thinking, and learning (Frontiers in Psychology, 2023).
Where Do the Risks Begin?
Despite these benefits, clinical observations and recent research show that ChatGPT use also carries psychological risks. A study conducted by the MIT Media Lab (2025) found higher levels of loneliness, emotional dependency, and reduced social interaction among individuals who used ChatGPT intensively. The longer the usage, the more pronounced these negative effects became.
At this point, compulsive ChatGPT use emerges as a significant risk factor. Compulsive use refers to frequent, repetitive use that the person struggles to control: users turn to the platform at every opportunity to chat, ask questions, or simply regulate their emotions. While this may provide short-term relief, in the long run it creates a cycle resembling addiction. A study published in ScienceDirect (2024) found that compulsive ChatGPT use is associated with mental fatigue, burnout, and reduced quality of life. This addiction-like usage affects not only psychological health but also social relationships. The phenomenon known as artificial intimacy leads some users to form strong emotional bonds with ChatGPT, pushing real relationships into the background. The result is deepened social isolation and increased feelings of loneliness.
These effects are not limited to the emotional domain; they also weaken cognitive processes and decision-making. One study observed decreased brain activity, reduced motivation, and a decline in creativity among students who became overly dependent on ChatGPT (Washington Post, 2025). Over-reliance of this kind can lead users to sideline their own thinking, weaken their critical perspective, and defer more and more decisions to artificial intelligence. Young people are especially vulnerable: young users with limited social circles tend to bond with ChatGPT, distancing themselves from real relationships and accepting AI guidance without questioning it. OpenAI CEO Sam Altman has openly stated that young people's excessive emotional reliance on AI can have dangerous consequences for both individual development and healthy decision-making (Business Insider, 2025).
Conclusion: Balanced Use Is Essential
When used correctly, ChatGPT can be a powerful tool for gaining information, learning, and personal development. With its ability to provide quick answers, support creativity, and offer emotional companionship, it holds significant potential. However, compulsive use, artificial intimacy, and cognitive decline are psychological risks that can overshadow ChatGPT's supportive aspects and must be approached with caution.
You can contact us to get information about Personova developed by Baltas Group.
63-66 Hatton Garden
London, England, EC1N 8LE