ChatGPT, write my wedding vows: are we OK with AI in everyday life?

TruthLens AI Suggested Headline:

"The Role of AI in Personal Communication: Balancing Convenience and Authenticity"

AI Analysis Average Score: 7.9
These scores (0-10 scale) are generated by TruthLens AI's analysis, assessing the article's objectivity, accuracy, and transparency. Higher scores indicate better alignment with journalistic standards.

TruthLens AI Summary

In recent months, the integration of artificial intelligence (AI) into personal communication has sparked a debate about authenticity and emotional expression. Nik Vassev, a tech entrepreneur from Vancouver, Canada, exemplified this trend when he turned to Claude AI to help craft a message of condolence for a grieving friend. While Vassev typically uses AI for work-related tasks, he found value in using it for personal messages, seeking assistance in expressing feelings that he found difficult to articulate. Despite the effectiveness of the AI-generated message in fostering a deeper conversation about grief, Vassev refrained from disclosing the AI's involvement, fearing that it might diminish the perceived sincerity of his words. This sentiment is echoed by many users who, while benefiting from AI's efficiency and creativity, worry about the implications of relying on technology for heartfelt communications. Studies suggest that disclosing AI assistance can lead to negative social perceptions, as seen in cases where individuals felt betrayed upon learning their loved ones had used AI to craft significant personal messages, such as wedding vows or birthday cards.

The ongoing discourse raises critical questions regarding the balance between convenience and authenticity in relationships. While some argue that AI can enhance communication by providing support, especially for those who struggle with emotional expression, others contend that it risks undermining genuine human connection. Experts warn that over-reliance on AI can lead to a loss of emotional intelligence and self-expression, as individuals may opt for the ease of AI-generated messages over the complex, sometimes messy, process of personal communication. The philosophical implications of this shift are profound; as AI takes on more of this relational labor, it may alienate individuals from their own feelings and diminish the value of the effort put into nurturing relationships. Ultimately, the challenge lies in discerning when AI serves as a helpful tool and when it detracts from the meaningful human interactions that form the foundation of our connections with others.


Unanalyzed Article Content

Earlier this spring, Nik Vassev heard a high school friend’s mother had died. Vassev, a 32-year-old tech entrepreneur in Vancouver, Canada, opened up Claude AI, Anthropic’s artificial intelligence chatbot.

“My friend’s mom passed away and I’m trying to find the right way to be there for him and send him a message of support like a good friend,” he typed.

Vassev mostly uses AI to answer work emails, but also for personal communications. “I just wanted to just get a second opinion about how to approach that situation,” he says. “As guys, sometimes we have trouble expressing our emotions.”

Claude helped Vassev craft a note: “Hey man, I’m so sorry for your loss. Sending you and your family lots of love and support during this difficult time. I’m here for you if you need anything …” it read.

Thanks to the message, Vassev’s friend opened up about their grief. But Vassev never revealed that AI was involved. People “devalue” writing that is AI assisted, he acknowledges. “It can rub people the wrong way.”

Vassev learned this lesson because a friend once called him out for relying heavily on AI during an argument: “Nik, I want to hear your voice, not what ChatGPT has to say.” That experience left Vassev chastened. Since then, he’s been trying to be more sparing and subtle, “thinking for myself and having AI assist”, he says.

Since late 2022, AI adoption has exploded in professional contexts, where it’s used as a productivity-boosting tool, and among students, who increasingly use chatbots to cheat.

Yet AI is becoming the invisible infrastructure of personal communications, too – punching up text messages, birthday cards and obituaries, even though we associate such compositions with “from the heart” authenticity.

Disclosing the role of AI could defeat the purpose of these writings, which is to build trust and express care. Nonetheless, one person anonymously told me that he used ChatGPT while writing his father-of-the-bride speech; another wished OpenAI had been around when he had written his vows because it would have “saved [him] a lot of time”. Online, a Redditor shared that they used ChatGPT to write their mom’s birthday card: “she not only cried, she keeps it on her side table and reads [it] over and over, every day since I gave it to her,” they wrote. “I can never tell her.”

Research about transparency and AI use mostly focuses on professional settings, where 40% of US workers use the tools. However, a recent study from the University of Arizona concluded that “AI disclosure can harm social perceptions” of the disclosers at work, and similar findings apply to personal relationships.

In one 2023 study, 208 adults received a “thoughtful” note from a friend; those who were told the note was written with AI felt less satisfied and “more uncertain about where they stand” with the friend, according to Bingjie Liu, the lead author of the study and an assistant professor of communication at Ohio State University.

On subreddits such as r/AmIOverreacting or r/Relationship_advice, it’s easy to find users expressing distress upon discovering, say, that their husband used ChatGPT to write their wedding vows. (“To me, these words are some of the most important that we will ever say to each other. I feel so sad knowing that they weren’t even his own.”)

AI-assisted personal messages can convey that the sender didn’t want to bother with sincerity, says Dr Vanessa Urch Druskat, a social and organizational psychologist and professor specializing in emotional intelligence. “If I heard that you were sending me an email and making it sound more empathetic than you really were, I wouldn’t let it go,” she says.

“There’s a baseline expectation that our personal communications are authentic,” says Druskat. “We’re wired to pick up on inauthenticity, disrespect – it feels terrible,” she says.

But not everyone draws the same line when it comes to how much AI involvement is tolerable or what constitutes deceit by omission. Curious, I conducted an informal social media poll among my friends: if I used AI to write their whole birthday card, how would they feel? About two-thirds said they would be “upset”; the rest said it would be fine. But if I had used AI only in a supplementary role – say, some editing to hit the right tone – the results were closer to 50-50.

Using AI in personal messages is a double gamble: first, that the recipient won’t notice, and second, that they won’t mind. Still, there are arguments for why taking the risk is worthwhile, and why a hint of AI in a Hinge message might not be so bad. For instance, AI can be helpful for bridging communication gaps rooted in cultural, linguistic or other forms of diversity.

Plus, personal messages have never been totally spontaneous and original. People routinely seek advice from friends, therapists or strangers about disagreements, delicate conversations or important notes. Greeting cards have long come with pre-written sentiments (although Mother’s Day founder Anna Jarvis once scolded that printed cards were “lazy”).

Sara Jane Ho, an etiquette expert, says she has used ChatGPT “in situations where I’ve been like: ‘Change this copy to make it more heartfelt.’ And it’s great copy.”

Ho argues that using ChatGPT to craft a personal message actually shows “a level of consideration”.

Expressing sensitivity helps build relationships, and it makes sense that people who struggle with words would appreciate assistance. Calculators are standard digital tools; why not chatbots? “I always say that the spirit of etiquette is about putting others at ease,” she says. “If the end result is something that is nice for the other person and that shows respect or consideration or care, then they don’t need to see how the sausage is made.”

I asked Ho what she would say to a person upset by an AI-assisted note. “I’d ask them: ‘Why are you so easily offended?’” Ho says.

Plus, she says using AI is convenient and fast. “Why would you make yourself walk someplace if you have a car?” she asks.

Increasingly, people are drifting through digitized lives that reject “the very notion that engagement should require effort”, perceiving less value in character building and experiences like “working hard” and “learning well”, author and educator Kyla Scanlon argued in an essay last month. This bias toward effortlessness characterizes the emotional work of relationships as burdensome, even though it helps create intimacy.

“People have sort of conditioned themselves to want a completely seamless and frictionless experience in their everyday lives 100% of the time,” says Josh Lora, a writer and sociologist who has written about AI and loneliness. “There are people who Uber everywhere, who Seamless everything, who Amazon everything, and render their lives completely smooth.”

Amid this convenience-maxxing, AI figures as an efficient way out of relational labor, or small mistakes, tensions and inadequacies in communication, says Lora.

We use language to be understood or co-create a sense of self. “So much of our experience as people is rendered in the struggle to make meaning, to self actualize, to explain yourself to another person,” Lora says.

But when we outsource that labor to a chatbot, we lose out on developing self-expression, nuanced social skills, and emotional intelligence. We also lose out on the feelings of interpersonal gratitude that arise from taking the time to write kindly to our loved ones, as one 2023 study from the University of California, Riverside, found.

Many people already approach life as a series of objectives: get good grades, get a job, earn money, get married. In that mindset, a relationship can feel like something to manage effectively rather than a space of mutual recognition. What happens if it stops feeling worth the effort?

Summer (who requested a pseudonym for privacy), a 30-year-old university tutor, said she became best friends with Natasha (also a pseudonym) while pursuing their respective doctorate degrees. They lived four hours apart, and much of their relationship unfolded in long text message exchanges, debating ideas or analyzing people they knew.

About a year ago, Natasha began to use ChatGPT to help with work tasks. Summer said Natasha quickly seemed deeply enamoured with AI’s speed and fluency. (Researchers have warned the technology can be addictive, to the detriment of human social engagement.) Soon, subtle tone and content changes led Summer to suspect Natasha was using AI in their personal messages. (Natasha did not respond to a request for comment.)

After six years of lively intellectual curiosity, their communication dwindled. Occasionally, Natasha asked Summer for her opinion on something, then disappeared for days. Summer felt like she was the third party to a deep conversation happening between her best friend and a machine. “I’d engage with her as a friend, a whole human being, and she’d engage with me as an obstacle to this meaning-making machine of hers,” Summer tells me.

Summer finally called Natasha to discuss how AI use was affecting their friendship. She felt Natasha was exchanging the messy imperfections of rambling debate for an emotionally bankrupt facsimile of ultra-efficient communication. Natasha didn’t deny using chatbots, and “seemed to always have a reason” for continuing despite Summer’s moral and intellectual qualms.

Summer “felt betrayed” that a close friend had used AI as “an auxiliary” to talk to her. “She couldn’t find the inherent meaning in us having an exchange as people,” she says. To her, adding AI into relationships “presupposes inadequacy” in them, and offers a sterile alternative: always saying the right thing, back and forth, frictionless forever.

The two women are no longer friends.

“What you’re giving away when you engage in too much convenience is your humanity, and it’s creepy to me,” Summer says.

Dr Mathieu Corteel is a philosopher and the author of a book (available only in French) grappling with the implications of AI as a game we’ve all entered without “knowing the rules”.

Corteel is not anti-AI, but believes that overreliance on it alienates us from our own judgement, and by extension, humanity – “which is why I consider it as one of the most important philosophical problems we are facing right now”, he says.

If a couple, for example, expressed love through AI-generated poems, they’d be skipping crucial steps of meaning-making to create “a combination of symbols” absent of meaning, he says. You can interpret meaning retrospectively, reading intent into an AI’s output, “but that’s just an effect”, he says.

“AI is unable to give meaning to something because it’s outside of the semantics produced by human beings, by human culture, by human interrelation, the social world,” says Corteel.

If AI can churn out convincingly heartfelt words, perhaps even our most intimate expressions have always been less special than we’d hoped. Or, as tech theorist Bogna Konior recently wrote: “What chatbots ultimately teach us is that language ain’t all that.”

Corteel agrees that language is inherently flawed; we can never fully express our feelings, only try. But that gap between feeling and expression is where love and meaning live. The very act of striving to shrink that distance helps define those thoughts and feelings. AI, by contrast, offers a slick way to bypass that effort. Without the time it takes to reflect on our relationships, the struggle to find words, the practice of communicating, what are we exchanging?

“We want to finish quickly with everything,” says Corteel. “We want to just write a prompt and have it done. And there’s something that we are losing – it’s the process. And in the process, there’s many important aspects. It is the co-construction of ourselves with our activities,” he says. “We are forgetting the importance of the exercise.”

Source: The Guardian