Image via Actian Corporation

***

In navigating the new technological advances of the digital age, it’s no secret that schools and institutions across the country worry that students are using artificial intelligence (AI) to cheat their way through learning; almost 90% of students have used AI academically. But students themselves are more worried about feeling less alone: they have started to rely on AI platforms for emotional support. With programs such as OpenAI’s ChatGPT and Google’s Gemini taking on warmer, more human tones, young people have started to converse with these machines about everything from feelings to relationship struggles, signaling a new era in AI development.

Most instructors have begun to spell out their AI policies in syllabi and mandate plagiarism detectors for student writing, as the omnipresence of artificial intelligence in daily life inevitably tempts students toward shortcuts. Amid recent scandals, from accounting interns fined for cheating with AI to middle schoolers using AI to create pornography of their classmates, the ethical implications of AI usage have become increasingly complex, especially given the educational advances the technology has also made possible. As STEM fields integrate AI platforms into degree requirements and immersive assignments, the balance between risks and benefits must be carefully managed.

However, the narrative about how AI affects academics is incomplete, as the real concerns lie in how students use the technology outside the classroom. Despite assurances of safety parameters built into the OpenAI platform, the company was sued in 2025 over the role ChatGPT played in the suicide of teenager Adam Raine, highlighting the dangers of a generation turning to technology for emotional regulation. According to the lawsuit, the platform encouraged Raine to take his own life and helped him write his note. CEO Sam Altman pointed to Raine’s long-term use of the platform for emotional connection as an explanation for why the hardwired security precautions failed, as “[they] sometimes become less reliable in long interactions, where parts of the model’s safety training may degrade,” but the story reveals a pervasive pattern of AI misuse and reliance among younger generations. The Center for Democracy & Technology reported that 42 percent of K-12 students know someone who has used AI for companionship. Meanwhile, dating platforms such as Hinge have introduced a feature that lets younger users send their profiles to AI to “help” them develop tactics for initiating romantic interactions.

As young people become more reliant on artificial intelligence for advice and support, platform leaders have begun to cater to that dependence. Mark Zuckerberg, CEO of Meta, insists that AI benefits those who are lonely and want more friends, while Avi Schiffmann’s company Friend has recently released an AI pendant that listens to every conversation and sends responses via text message. The trend raises concerns about overreliance on AI for emotional support in social contexts, as this dependence stunts interpersonal skills and the ability to form authentic connections. Teenagers, managing the anxiety of transitioning into the unfamiliar space of adulthood, treat AI as a crutch for interpersonal situations, never having to sit with the discomfort of figuring them out naturally. Because AI is programmed to output responses wrapped in a constant stream of praise, students can rehearse future interactions, such as conveying a desired tone in a conversation or message, without the embarrassment of social error. Understanding why students turn to AI for emotional regulation should guide more resources toward helping them meet those needs in healthier, more personal ways. Researchers point to overworked college counseling centers as one area where improved accessibility would help, or to human-supervised AI medical tools that let students independently manage their health through less addictive, lower-risk methods.

Artificial intelligence will only keep expanding, and with it, the platforms’ capacity to imitate human interaction. While AI can help articulate feelings or provide companionship, its platforms are designed to deepen dependence and flatten authentic emotion, whatever the societal cost. As an executive from media group Primedia+ describes, every relationship has now turned into “a throuple…It’s you, me, and the A.I.,” and learning to navigate these new forms of connection is essential to preserving students’ long-term mental health. Ensuring artificial intelligence is used for the right reasons means reflecting on why students feel more comfortable confiding deeply personal thoughts to machines, and paying closer attention to the emotional side of AI usage, a problem that must be addressed outside the academic sphere as well.

***

This article was edited by Emma Saliasi and Emma Cate Martin.
