Understanding the Limitations of AI in Marketing for Legal Tech
A version of this article first appeared in the May 1, 2023 issue of Legal Tech News.
Much conversation has surrounded the release of GPT-4 and, more broadly, the role of AI in disrupting marketing. In fact, Salesforce’s Fourth Annual State of Marketing research report indicates that 51% of marketers are already incorporating AI, and McKinsey suggests AI applications in marketing and sales could hold an approximate value of $2.6 trillion.
So what really is the impact of AI on marketing within the legal tech space? Let’s take a closer look.
Smaller brands that lack the resources to invest in skilled in-house marketing and sales teams or external agencies might find AI tools like ChatGPT, Jasper.ai, Quillbot and Copy.ai helpful for producing blogs, web copy, email copy, ad copy and social media captions. These tools churn out content quickly, and they can be a good starting point for personalized communication and meeting scheduling where a nuanced understanding of context or stakeholder differentiation is not required.
In fact, if a brand simply wants to churn out content across platforms to tick all the boxes, without paying attention to the ramifications for long-term branding and positioning, these tools can be a boon. However, companies will still need a dedicated person to manually check all the copy AI tools produce, for the reasons listed below.
1. Lack of Real-Time Content
At the moment, tools like ChatGPT cannot access the latest real-time information. This poses limitations when a piece of content needs current stats or breaking news. In an industry where immediate trend alerts matter, AI tools cannot anchor a dynamic content strategy.
2. Lack of Consistent Accuracy
AI tools cannot reliably generate fully factual content. Professionals have pointed out countless examples of AI tools outputting fabricated facts, a phenomenon referred to as “hallucinations.” Content must be fact-checked to avoid legal issues; imagine the irony of a legal tech company landing in legal trouble because misinformation from an AI tool was published unchecked. A member of our team recently flagged a case in which ChatGPT generated a list of law journal articles on a legal doctrine that were entirely fictitious. Even the list of sources ChatGPT shared was fake.
3. Lack of 100% Originality
Although AI tools claim to create original content, many users have raised plagiarism concerns, and AI-generated content could result in intellectual property violations. There is a genuine risk that a legal tech company could receive AI-generated content partially lifted from a competitor’s platform. And if several brands in the same industry publish similar, even if not identical, content, the resulting lack of brand differentiation can hurt conversions. In short, AI tools are tone-deaf and do not understand branding, positioning or creativity.
4. Lack of Subject Expertise
AI tools are not human. They cannot reason or express emotion the way a thought leader with personal experience, exclusive anecdotes and unique real-life insights can, nor can they produce content built on complex nuance. Leaders of legal tech companies, for example, should steer clear of using AI tools to churn out thought leadership pieces for a company website, personal blog or LinkedIn. Robotic content offers no real uniqueness, emotional intelligence, subject expertise or value-add.
5. Lack of Unique Voice
The legal tech world is uniquely competitive yet inclusive and familial. A brand must maintain a distinct tone of voice that speaks compellingly to its particular audience, and that voice must stay consistent across all communication. While ChatGPT can put together marketing content and sales material in a matter of seconds, it takes an actual person to fine-tune those messages so they stand out and resonate with target audiences. For example, the screenshots embedded in this article show ChatGPT generating several options for an email campaign, which then required manual editing to produce the final version we ran with, one in line with our brand’s voice and our audience’s needs.
6. Lack of Compliance
Not all AI tools meet compliance requirements for security and privacy. Legal tech companies must be careful not to share confidential client information when generating marketing content or other material. Platforms such as ChatGPT may retain the data users input and use it to train their models. AI tools may also fall short of compliance laws like GDPR (e.g., the “right to be forgotten”). In short, using AI tools can raise a variety of ethical and legal concerns.
The Final Verdict
Are AI tools going to erase the need for marketing experts and content writers in legal tech? The simple answer is no.
AI has the potential to create content quickly from a user’s specific prompts, including emails, blogs, social media posts, web copy, briefs, chatbot copy and more. However, a subject matter expert is still needed to a) cross-check accuracy, b) scan for and address potential plagiarism and c) tweak content to align with brand tonality.
Furthermore, AI tools pose hurdles for thought leadership pieces, content that is sensitive or nuanced in nature, and material that needs to draw on real-time trends.
AI-generated content might be speedy, but it is not always better. It is mostly surface level, and it can neither exhibit the true authority of a seasoned professional nor demonstrate a human touch.