When Our Words Stop Sounding Human
Author Allison Carmen discusses the harm of using AI in our communication with others, across all manner of writing, including email.
Artificial intelligence (AI) can enhance human communication by improving the quality of our writing, voice, or appearance. But researchers have also found that AI-mediated communication carries real risks. It can increase deception, compromise authenticity, and lead to mistrust.
As a published author, I never imagined that an AI tool could fracture a professional relationship I deeply valued until it happened to me.
After a difficult meeting, a colleague I’d mentored sent me an email that didn’t sound like her. It was smooth, perfectly phrased, but somehow hollow. It was the kind of message that looks fine on the surface but feels wrong in your gut. Her words felt hurtful, not because they were cruel, but because they didn’t sound real. I didn’t have to wonder if she’d used ChatGPT. I knew she had.
That knowledge landed like a small betrayal. We’d spent years building trust through mentoring sessions, shared projects, and honest feedback. And now, in a moment that called for vulnerability, she had handed her voice to a machine.
Still, I wanted to understand why the message felt so wrong. So, in a moment of dark irony, I pasted it into ChatGPT and asked what it thought.
It read the email and replied: “This message sounds like it was written by AI.”
Then it added: “The tone is professional and polite, but emotionally flat. Some phrases, like ‘I was disheartened’ or ‘wasn’t given the opportunity,’ suggest frustration that’s been softened or sanitized. It reads as if the writer wanted to sound composed more than genuine. The overall effect is detached—the kind of distance often seen in AI-generated text.”
I remember just staring at the screen for a moment. Somehow that made it sting even more. The message that should have been human had been ghostwritten by something incapable of caring.
I wasn’t angry that she had feelings about the meeting; that’s normal. If she had called and said, “Hey, that meeting was rough,” I would have listened. Even a short, awkward email in her own voice would have been fine. But this email had perfect sentences and no warmth or risk.
It reminded me of that Friends episode where Joey helps Monica and Chandler write their adoption letter and uses a thesaurus on every word. Their simple line, “They’re warm, nice people with big hearts,” becomes “They’re humid, prepossessing Homo sapiens with full-sized aortic pumps.”
When someone uses AI to express emotion, it’s like running their heart through a thesaurus. The message may sound articulate, but it loses its pulse. In my books and my writing, it’s always the human connection that draws readers in and keeps them reading.
Recent surveys reflect that discomfort. In a 2025 Pew Research Center study, most U.S. workers said they are more worried than hopeful about AI’s growing role in the workplace. And research from the USC Marshall School of Business found that when managers relied heavily on AI to compose emails, employees perceived them as less sincere and less caring. The study noted that limited AI assistance did not harm perceptions of professionalism or warmth, but when most of a message was generated by tools like ChatGPT, it came across as inauthentic and damaged trust.
ChatGPT and tools like it can help us write better, clearer, and faster. But they can’t help us be more human. We can use AI to polish our writing, not our feelings. To fix grammar, not soften truth. Every professional relationship has a personal side to it, built on trust, respect, kindness, and a sense of relatability that fuels success and meaning in the workplace. No technology, no matter how advanced, can carry the weight of a relationship or the grace of repair.
After much consideration, I spoke with my colleague and told her directly how her ChatGPT-written email made me feel. I told her I understood why she had used it: she was upset and wanted her words to be perfect. But I also let her know it created distance between us that can breed mistrust, not from malice, but from misalignment. We agreed that next time she could simply pick up the phone or send a brief note in her own voice. We then had the opportunity to talk through what happened at the meeting and resolve our issues.
However, it took months to rebuild trust and ease the awkwardness of the conflict before we found our way back to the comfort and collaboration we once had. It took time and dedication from both of us, and I am not sure every company has the organizational structure, culture, or time to repair these kinds of ruptures.
We are now considering a workplace policy of no ChatGPT for certain types of internal emails, Teams messages, and patient communication. And my hope is that others will consider the same in their personal relationships as well.
The real lesson is that when we outsource our emotions to technology, we risk outsourcing our humanity too. Emails are a slippery slope—when will it be books? In relationships built on trust, we don’t need perfect words. We just need our own.