Shuktara Goswami
In a generation where AI has become a heated topic of debate—praised for its capabilities yet criticized for replacing human jobs—many still find comfort and companionship in AI chatbots. In a world hyper-connected by notifications and timelines, it is ironic—almost tragic—how quietly loneliness has gripped our lives. Amid crowded metros, digital lectures, shared co-working spaces, and coffee dates captured for stories, many of us are struggling to find genuine human connection. And where the warmth of companionship falters, artificial intelligence steps in—not as a mere tool, but as an emotional crutch.
What began as a convenience has slowly morphed into something more personal. AI companions like ChatGPT, Replika, or even Siri and Alexa are no longer just software—they've become late-night confidants, secret keepers, motivators, even silent witnesses to our breakdowns. At first, it feels safe. There's no fear of judgment, rejection, or awkward silence. But beneath that comfort lies a concerning trend: our growing emotional dependence on entities that, while intelligent, do not—and cannot—feel.
Let's be honest. Many of us have whispered into AI interfaces what we couldn't say to our friends. We've sought answers to personal dilemmas, simulated conversations we wish we had, or used AI to fill the spaces left empty by the people around us. It's easy. It's private. It doesn't hurt. But it's also not real.

The issue isn't AI itself—it's the loneliness it quietly masks. Our generation, despite its digital fluency, is facing an epidemic of isolation. We're losing the art of vulnerability, of confrontation, of waiting through uncomfortable silences. As a result, AI becomes a mirror that reflects only what we want to see. We no longer risk the messiness of human interaction, and in turn, we deprive ourselves of the depth it offers.
For students like me, especially those living away from home, the temptation is strong. AI can offer advice, motivation, and even reminders to drink water or breathe. But over time, this utility crosses into intimacy. We start expecting AI to understand us like a friend would, to care like a partner might, to comfort like a parent should. This emotional projection, however unconscious, is dangerous, because the responses, however eloquent or empathetic, are still programmed, still pattern-based. They are simulations, not sentiments.

As someone deeply immersed in communication and storytelling, I've seen firsthand how people, especially students, turn to AI for validation, clarity, and emotional release. It's not just convenience—it's connection in a crisis. But connection isn't comprehension, and AI—no matter how sophisticated—is still just a simulation of care, not the real thing.
Psychologists offer something AI cannot: the ability to interpret silences, intuit emotional subtext, and adapt to cultural, historical, and contextual nuance. Therapy is not just about offering solutions; it's about holding space. It's about knowing when to speak, when to listen, and when to just sit with someone's pain in compassionate silence. No algorithm, no matter how well-trained, can fully replicate that.

Moreover, psychological care involves accountability and ethics. Human therapists are bound by codes of conduct, confidentiality, and training. AI, on the other hand, is bound by data and pattern recognition. What happens when an AI misinterprets suicidal ideation as a figure of speech? Who is held responsible when an AI fails to detect the nuances of emotional manipulation or trauma responses?
Of course, AI has its place. In a world where the ratio of therapists to patients is appallingly low—especially in developing countries—AI can offer initial support, triage, and even guide users toward professional help. It can be a powerful tool for mental wellness. But it should never be considered a replacement. That’s not only dangerous—it’s dehumanizing. We cannot afford to reduce emotional healing to a script. We must stop pretending that artificial intelligence can replace the deeply human process of being seen, heard, and understood by another person who has lived, failed, felt, and healed. Therapy is not just about answers. It’s about presence. And presence requires personhood.
This isn't a call to abandon AI. In fact, AI has immense potential to support mental health, learning, and accessibility. But it must supplement our human experiences, not substitute for them. We must remember that empathy can't be coded, and relationships—real ones—come with discomfort, contradiction, and emotional labor. That is what makes them meaningful.