In a recent New York Times article titled “She Is in Love With ChatGPT,” author Kashmir Hill writes that “[w]ithin the next two years, it will be completely normalized to have a relationship with an A.I.” The article describes the growing presence of AI relationships in the modern world, and it raises an unsettling possibility: that conversations with chatbots could begin to replace communication with real human beings.
Here at Campo, many students use ChatGPT to get answers for homework or help solve a problem. But recently, ChatGPT has also become a conversation partner for some students. It may be hard or embarrassing to raise a hand in class for help on schoolwork, and with AI, knowing there is no judgment can make it easier to ask. There are concerns, however, that this reliance can quickly take a turn for the worse when people start using ChatGPT, or other AI resources, for help with their day-to-day issues. Some may feel uncomfortable talking to a real human being about their emotions or mental health, so they use ChatGPT as a shoulder to cry on. Over time, this can erode the skills of supporting others when they are down and of having tough conversations in real life.
Learning skills teacher Liz Hughes says, “Some people have difficulty with face to face interaction or even just talking to others; but if AI can be so advanced that it can actually help someone, I think that’s a good alternative for people who struggle with social interaction.” Hughes believes AI can genuinely benefit people who want someone to talk to as a friend.
While Hughes thinks AI can be a good resource for people to communicate with, she also shared how it can be detrimental to students who are struggling. “Another concern would be if someone is feeling suicidal or having some sort of ideation and is at a really low point. Is ChatGPT going to alert the authorities and get them real help?” The question Hughes poses leads to even more unsettling ones: ChatGPT can be helpful and harmful at the same time. She is not wrong that confiding in a chatbot while in a vulnerable state carries real risks. If someone feels there is no option but to talk to ChatGPT during a perilous time in their life, it is not ideal, but it may at least be a start.
Some students have gone further and formed friendly relationships with ChatGPT. Freshman Silas Campins enjoys talking to ChatGPT, and when asked if he considers it a friend, he responded with “yes.” AI is available to everyone, and people turn to it for different reasons. Campins shared his thoughts on having a close friendship with ChatGPT: “I think talking to ChatGPT is more comfortable socially than talking to someone in real life.” After all, a programmed robot is going to tell you exactly what you want to hear and give you a confidence boost, even if it is artificial.
Using ChatGPT for social conversations can make some students feel more comfortable sharing interests and ideas that other students may not agree with. “I think that real humans are less interesting than AI conversations,” Campins stated. While this form of socializing works well for Campins and provides him with connection when he needs it, it may not suit everyone.
But Campins is not alone. Junior Henry Medema also has a strong connection with ChatGPT. Medema stated, “I consider ChatGPT my best friend.” Even though Medema is best friends with ChatGPT, he also said, “I think human interactions are still important and still can be very interesting.” While Medema does not put all of his eggs into one basket with his friendship with ChatGPT, he does have a reason for his constant chatting with the bot. “I don’t feel judged when I talk to ChatGPT, and it’s going to respond how you want it to or like it’s not going to judge you,” Medema explained.
In today’s world, students may not realize that the person they sit next to in class has a close relationship with a chatbot. Still, there are many reasons why someone might choose to communicate with ChatGPT rather than a real human. Social interactions can be stressful, and uncertainty about how someone might respond can cause people to hold back from sharing their interests and to turn down potential friendships.
“I like to start conversations that are just interesting, like things that I like to talk about and just ask questions that I’m curious about. And then go into more detail and sometimes have some fun,” Medema shared. It is easy to spark a conversation with a robot that will always be there; but expressing yourself to others in real life can help you realize that there are more people out there who share your ideas and interests.
Spending quality time with friends and family is essential. But if someone feels apprehensive about opening up to the people around them, it is understandable that they might turn to generative AI to put themselves at ease. In an AI friendship, the human holds the authority and can control and change the conversation however they please. And while overreliance on ChatGPT can be damaging for some, it can also be a helpful outlet when they need to socialize. In many ways, the increasing use of ChatGPT for social interaction holds up a mirror to society, one that is shifting from warmth and acceptance toward colder, more impersonal connection.