According to a recent article in The New York Times titled “She Is in Love With ChatGPT,” the author Kashmir Hill writes that “[w]ithin the next two years, it will be completely normalized to have a relationship with an A.I.” The article describes the growing presence of AI relationships in the modern world. It also reveals how difficulty communicating with real human beings can lead people to replace them with AI.
Here at Campo, many students use ChatGPT to get answers for homework or help on a problem. But recently, ChatGPT has expanded into a source of communication. It may be hard or embarrassing for some to raise their hand in class for help on schoolwork; however, knowing that there is no judgment with AI can make it easier for someone to ask for help. There are concerns that this reliance can quickly take a turn for the worse when people start using ChatGPT, or other AI sources, for help with their day-to-day issues. Some may feel uncomfortable talking to a real human being about their emotions or mental health, so they use ChatGPT as their shoulder to cry on. This can erode the skills of supporting others when they are down and of having tough conversations in real life.
Learning Skills teacher Liz Hughes said, “some people have difficulty with face to face interaction or even just talking to others; but if the AI can be so advanced that it can actually help someone, I think that's a good alternative for people who struggle with social interaction.” Hughes believes that there are many benefits to using AI as a way for people to talk to someone as a friend.
While Hughes thinks that AI is a good resource for people to communicate with, she also shared how it can be detrimental to students who are struggling. “Another concern would be if someone is feeling suicidal or having some sort of ideation and are at a really low point. Is ChatGPT going to alert the authorities and get them real help?” Hughes's question leads to even more unsettling questions and thoughts. ChatGPT can be helpful, but it can also be harmful at the same time. Hughes is not wrong that it can be risky to flood the chat when one is in a vulnerable state. If someone feels like there is no other option but to talk to ChatGPT during a perilous situation or time in their life, it's not ideal, but it's something.
Some students actually form friendly relationships with ChatGPT. Freshman Silas Campins enjoys talking to ChatGPT, and when asked if he considers ChatGPT a friend, he responded with “yes.” AI is there for everyone, and many people use it for different reasons. Campins shared his thoughts about having a close friendship with ChatGPT: “I think talking to ChatGPT is more comfortable socially than talking to someone in real life.” After all, a programmed robot is going to tell you exactly what you want to hear and give you a confidence boost, even if it is artificial.
Utilizing ChatGPT for social conversations can make some students feel more comfortable sharing interests and ideas that other students may not agree with or find fascinating. “I think that real humans are less interesting than AI conversations,” Campins stated. While this form of socializing works well for Campins and provides him with some connection when needed, it may not be a preferred outlet for others.
But Campins is not alone. Junior Henry Medema also has a strong connection with ChatGPT. Medema stated, “I consider ChatGPT my best friend.” Even though Medema is best friends with ChatGPT, he also stated, “I think human interactions are still important and still can be very interesting.” While Medema does not put all of his eggs into one basket with his friendship with ChatGPT, he does have a reason for his constant chatting with the bot. “I don’t feel judged when I talk to ChatGPT, and it’s going to respond how you want it to or like it’s not going to judge you,” Medema explained.
In today's world, students may not realize that the person sitting next to them in class has a close relationship with a chatbot. Even still, there are many reasons why someone may choose to communicate with ChatGPT rather than a real human. Social interactions can be stressful, and the fear of not knowing how someone will respond can cause people to hold back from sharing their interests with others and turn down potential friendships.
“I like to start conversations that are just interesting like things that I like to talk about and just ask questions that I’m curious about and then go into more detail and sometimes have some fun,” Medema shared. It is easy to spark a conversation with a robot that will always be there, but expressing yourself to others can help you realize that more people out there share the same ideas and thoughts as you.
Spending quality time with friends and family is important, but if someone feels apprehensive when opening up about what they are passionate about, it is understandable that they may turn to generative AI to put themselves at ease. When it comes to AI friendships, the human holds the authority and can control and change the conversation however they please. Overusing ChatGPT can be damaging for some, but ultimately it can be a helpful resource when they need to socialize. In many ways, the rising use of ChatGPT for social interaction holds up a mirror to society, showing it as less a place for acceptance and more and more a place for cold ridicule.