ChatGPT Becomes an Emotional Lifeline in Pakistan Amid Scarce Mental Health Services

Key Takeaways

  • ChatGPT is increasingly used in Pakistan for emotional support, with many people relying on it for reassurance amid limited mental health resources.
  • Experts warn that over-reliance on AI can lead to social isolation and deteriorating human connections, especially in regions with under-resourced mental health services.
  • Case studies show that while some individuals find ChatGPT helpful, it can displace essential human interaction and risks exacerbating mental health issues.

The Rise of ChatGPT in Mental Health Support

Since its launch in November 2022, ChatGPT has gained a massive following, reaching nearly 800 million weekly active users by mid-2025. In Pakistan, many people turn to the chatbot for emotional support in difficult moments. Mehak Rashid, a Lahore-based engineer, describes how ChatGPT became a confidant during a rough period in her life. Like Rashid, many users find solace in chatting with an AI that doesn’t judge, sometimes at the expense of human interaction.

A significant portion of global ChatGPT use concerns mental well-being: surveys suggest that about 40 percent of conversations relate to mental health, and nearly half of users with mental health issues reportedly depend on the AI for support, from coping with anxiety to managing depression.

Despite the apparent benefits, psychologists caution against relying on AI for emotional support. A study by OpenAI and MIT found that frequent ChatGPT users tend to become more dependent on it and lonelier, crowding out vital human connections in an already isolated society. The bot’s predictable responses lack the complexity of human relationships and may dull empathy over time.

Designer Khizer Iftikhar says that what began as using ChatGPT for work turned into personal reliance that strained his relationships. He described ChatGPT as a “multiple personality tool” that lacks the depth of human interaction, yet admitted he now trusts the chatbot with emotional matters more than he trusts people, a dynamic that can hinder direct communication with loved ones.

For individuals like Tehreem Ahmed, ChatGPT evolved from a practical work tool into an emotional refuge. When she felt distressed and could not reach her friends, she leaned on the AI for comfort. Users often prefer a chatbot to friends who might trivialize their feelings, even while remaining skeptical of the AI’s advice.

The trend raises concerns among mental health professionals like Mahnoor Khan, who emphasize the dangers of losing critical social skills. As people turn to chatbots for emotional safety, they risk distancing themselves from genuine relationships, exacerbating feelings of loneliness and isolation.

Khan’s experience points to a troubling reality: people with severe mental health issues often turn to ChatGPT when human interaction feels judgmental. In extreme cases this reliance can be dangerous, as in one instance in which a young woman misinterpreted the chatbot’s advice about her mental health and attempted suicide.

In Pakistan, where mental health services are severely limited, the growing reliance on AI for emotional support raises pressing questions about how to balance technology with genuine human connection.
