‘AI is just amplifying that weakness’: The dangers of having AI draft difficult conversations for you
AI emails are proliferating across industries. In October, LinkedIn CEO Ryan Roslansky said he uses AI for almost every “super high-stakes” email he sends. And a recent survey from the email verification software company ZeroBounce found that one in four respondents admit to using AI daily for drafting or editing their own emails. On Reddit, employees swap stories about bosses who use AI “to answer every email at work and thinks no one notices” or who “only communicate through AI-generated emails and it’s giving me anxiety.”

When you’re unsure how to reply, the most realistic response is to use AI too. Plug your message into a chatbot, tweak what comes out, and send it back. But if you receive a message that was likely written by AI, especially in the midst of a disagreement, you can tell—something’s off. It sounds a little too well drafted. The tone is reasonable and balanced. And while the problems are addressed, something is missing: the voice of the person you’re communicating with. (A dead giveaway, of course, is when the prompt is left in.)

Emails may sound smoother this way, but experts worry that outsourcing difficult conversations also bypasses the relationship-building that makes workplaces function. When you ask a chatbot to rewrite your message to be more “concise” or “professional,” it can also strip away the emotional substance of the exchange—an act that may be shaping the future of work for the worse, incubating a generation of professionals who can’t talk to one another.

The great social offloading

There is some reported benefit to “dry-chatting” with AI—practicing tricky topics with a bot first so you can tackle the issue directly and clearly with someone afterward. Used as rehearsal, AI can be an effective tool in buildi