Why You Shouldn’t “Hire” AI to Draft Your Cease & Desist Letter


Yes, you read that right: AI is a traitor, and it doesn’t even know it. In the couple of years since AI’s predictive text started “working” for our clients, I’ve found myself reviewing GPT and Claude drafts for people more and more often. These letters are hard to criticize at first glance, but they are bad. And not just “bad”: they often actively make your case worse. Let me break it down.

Cease and desist letters (AKA C&Ds, demand letters, nastygrams, whatever you call them) are the first step in building your case against the other side. It’s a little like claiming territory in the opening turns of a game of Risk, or buying certain properties on your first trip around the Monopoly board. Your decisions matter; they shape the entire game. Your first demand letter stakes out your legal position. You typically lead with your strongest legal argument and show some evidence to support it.

When I first read AI-generated C&D letters, having reviewed a ton of write-ups drafted by newbie attorneys, I was impressed. The language sounded advanced. It invoked numerous legal terms of art with special meanings (shaped by hundreds of years’ worth of case law opinions and statutes). But when I actually laid out the legal argument being made, massive problems showed up.

The Importance of Strategizing Before Writing a Cease and Desist Letter

Before you write a C&D, you have to take a step back and strategize. In any given case, there will be multiple legal arguments and positions available to you. As a non-lawyer, you’re probably aware of a couple of them, but not all. Strategizing means marking out your territory and charting a route to success. If you know the laws at issue and how these cases typically play out, you can protect yourself from, say, a rattlesnake waiting on the trail at stage three by taking a route that avoids the rattlesnake or makes it irrelevant. Obviously, I’m speaking in metaphors, but hopefully this makes sense.

More importantly, all of your legal arguments and positions have to jibe; they have to dovetail together and support one another. If you argue that (1) all clouds are white, and (2) the sky was filled with clouds, therefore (3) the entire sky was white, you can’t simultaneously argue that the clouds were gray or that it was a sunny day.

AI doesn’t adjust for any of this, though; it just sounds really smart. With newbie attorneys in training, we get the reverse problem: we talk through the overall strategy of a case before they begin writing, and then their drafts read as amateur or use some incorrect terminology (AI does this, too). But those mistakes are easy to spot, and the overall gist of the letter is there; it makes sense strategically.

In contrast, AI-generated letters skip this initial step of strategically mapping out your legal positions, both now and for the future of the case. They give you flavorful, expert-sounding drafts that lay out one path and then immediately undermine that core argument with an entirely different one (clouds are gray / the sky was sunny / fireworks made the sky pink and green).

The Consequences of Using a Cease and Desist Letter Written by AI

A good lawyer can take these AI-generated letters and use them against you to gut your case. There’s a rule of evidence that anything you say can be used against you (sound familiar?). Well, whatever AI writes for you that you send to an opposing party counts as you saying it. Later, when you hire a real attorney to get you out of this mess, you can’t go back on what you said; that would impeach your own testimony, and it makes the jury lose trust in you. How can they believe what you say now, when you said something totally different just before? And if your defense is “I put a prompt into Claude, that wasn’t me,” then you’re telling the jury that you didn’t care enough about this case to get it right the first time, but they should, what, believe you now?

ChatGPT, Claude, whatever AI you’re using, has a strong tendency to shoot you in the foot. And no, you can’t just fix this with a better prompt, because AI doesn’t currently understand the journey of a case or the pitfalls and traps of taking certain positions (the rattlesnakes down the trail). Maybe it will in the future, but right now, it’s making people’s cases worse.