AI for Lawyers: A Roadmap to Using AI in Your Law Firm (Part 2)

Melissa Rogozinski
11.29.2023 02:05 PM


by Melissa Rogozinski, Chief Executive Officer, and Steve Salkin, Editor-in-Chief of Law Journal Newsletters


A version of this article first appeared in the November 2023 issue of Cybersecurity Law & Strategy


The use and integration of technology into the practice of law are among the most exciting and controversial subjects being addressed in law firms, courtrooms and law schools everywhere. The reality of AI is causing many legal professionals to wonder whether they may be replaced by the ever-evolving technology.

 

At a September panel discussion sponsored by the Miami Dade Bar Association Law and Technology Committee, the experts unanimously concluded that AI will not replace lawyers, but lawyers who use AI effectively will replace lawyers who don’t. Panelists included Stephanie Wilkins, Editor-in-Chief of Legaltech News, Steve Salkin, Editor-in-Chief of Law Journal Newsletters, and Ralph Losey, Esq. of Losey Law, PLLC.

 

Artificial intelligence is — and will continue to be — an extraordinarily powerful tool with the potential to provide great benefits to society, and one that also poses serious dangers if appropriate guardrails are not constructed while the technology is still young. Importantly, the creators and most enthusiastic promoters of AI all agree that regulation is needed to ensure that AI does not exceed boundaries that threaten national security, the integrity of voting infrastructure, or the public's trust in the veracity and reliability of the information it receives.

 

While the U.S. government is progressing in its development of a regulatory framework, the EU seems far ahead in instituting an enforceable set of AI regulations. The EU AI Act may be the first comprehensive set of regulations to limit the use of AI systems and to mandate that AI-generated content be disclosed as such. The regulatory scheme has yet to be formally enacted, but it imposes higher obligations on both providers and users as the level of risk increases. The plan bans harmful AI practices such as the use of 'real-time' remote biometric identification systems in publicly accessible spaces for law enforcement purposes, except in a limited number of cases.

 

Concerns exist that the EU’s regulatory approach may complicate efforts to establish commonly shared regulatory protocol because the EU did not consult extensively with all AI industry leaders in drafting its rules.

 

Key Steps for Law Firms

The value of AI as a productivity engine is unquestionable. AI can complete dozens of hours of administrative work accurately and efficiently in minutes, freeing up attorneys to apply their expertise to more valuable functions such as helping clients, giving better advice, and honing their skills — all the things that lawyers went to law school to do.

 

The technology has already demonstrated a remarkable capacity for producing comprehensive drafts of legal memoranda and research-based documents that assist attorneys in writing their own final versions.

 

However, AI is not now, and may never be, reliable enough to draft a document appropriate for submission to a court or a client without an attorney thoroughly checking every fact and legal citation. AI is merely a tool that can reduce an attorney's research and writing time, but it is not a substitute for years of legal practice and human experience.

 

Data Privacy and Cybersecurity

To ensure that a firm's leaders and its entire staff of associates and administrators observe best practices when using AI, law firms should conduct educational programs on the technology.

 

Every firm should produce and distribute an AI handbook addressing the firm's policies and procedures on AI, including strong instructional guidelines. In addition to the rule, mentioned earlier, against relying solely on an AI-drafted filing or memo, at the top of that list is that no confidential information or client identification data should ever be entered into ChatGPT or any other generative AI program.

 

Firms with 10 or more lawyers should have a tech-savvy specialist on board to oversee the use of AI and to help educate new and existing associates and support staff.

 

Ralph Losey, a senior partner at Losey PLLC and a leading expert, lecturer and author on legal tech, told the Miami Dade Bar Association conference attendees that “AI is like any advanced tool, it takes study and it takes training.”

 

Experts who participated in the Miami Dade Bar Association panel recommended that any AI tool should be viewed like email: “You wouldn’t put anything into an email that would jeopardize the attorney-client privilege. It’s the same thing with AI. Once you put client data or anything confidential into ChatGPT, it’s going to be out there for anyone to use.”

 

Vetting AI Providers

Law firms need to be thorough in their consideration of which AI programs to adopt. AI vendors often exaggerate the potential benefit of their product or describe the program’s security features as being more robust than they truly are.

 

Serious AI product vendors should be willing and able to answer any question you have about their product and AI in general. Consider what your firm’s needs are and what pain points the product will serve to remedy. And inquire about the specific plan the vendor has for rolling out the product to the firm’s entire staff.

 

The AI service provider your firm selects should also be prepared to instruct the firm's users on how to construct accurate and effective prompts for their particular tasks. Good prompt engineering is a valuable skill: a refined, detailed and specific request will produce the most useful and helpful results. For example, “Summarize the key data-breach notification deadlines under Florida law in plain language, in 300 words or less” will return far better output than “Tell me about data breach law.”

 

Keep in mind that new applications, with new tools and features, are being released regularly. Investigate the market's entire offering before committing to a particular AI program; there may be a product designed to meet your needs more completely than the one you first encounter. Check out TheresAnAIForThat or TheoremLegal for more information on available AI tools.

 

Don’t Be Afraid 

Some support staff, and even more lawyers, might be a bit skittish about using AI in their legal work. That is understandable, given the horror stories that have been passed around and the overall fear that AI will replace some jobs.

 

However, there is no reason to be afraid. Whether they realize it or not, lawyers and support staff have been using some form of AI for years: 

  • That billing system that somehow magically estimates the number of billable hours a certain matter will take or that the attorney will end up billing for the year.
  • A Google search.
  • Even texting (how do you think the phone knows that you wanted to type your partner’s name?).

That’s all AI.

 

The panelists at the Miami Dade webinar suggested something that can ease the fear: Play with it. Go to ChatGPT, or Bard, or any other generative AI tool and ask it what you should make for dinner. Seriously. Give it the ingredients you have on hand and ask it to come up with recipes that you can make. One member even used it to generate a name for a new gin cocktail containing a specified list of mixers. Ask it to write your biography (remember Rule #1 and don't input any personal information that's not already on your LinkedIn profile). Anything non-work related. Once you get comfortable using it in that way, then think of ways you can use it at work. Start slow, but start. Then maybe you'll see that AI isn't some monster hiding under your desk.

 

AI in the Courts

With AI challenging so many established norms, questions about AI’s legal limits, its use of others’ creative work to “learn,” and whether intellectual property on the internet is protected from AI’s mining activities are all being litigated in courts throughout the U.S.

 

Sarah Silverman sued OpenAI (the creator of ChatGPT), alleging that the use of her copyrighted material to “teach” ChatGPT was unauthorized. See Silverman v. OpenAI, Inc., No. 3:23-cv-03416 (N.D. Cal.).

 

In June 2023, Mark Walters, a radio personality and vocal proponent of gun rights, sued the owner of ChatGPT for defamation because the program created an entirely false and allegedly libelous biography of Walters, including a list of illegal conduct in which it said he was involved. It was an AI “hallucination.” See Walters v. OpenAI, L.L.C., No. 1:23-cv-03122 (N.D. Ga.).

 

And a class action lawsuit was filed against OpenAI and Microsoft in California alleging that the program operates by stealing private data from hundreds of millions of people without their consent. See P.M. v. OpenAI, L.L.C., No. 3:23-cv-03199 (N.D. Cal.).

 

AI is testing the limits of current law and inviting the development of new legal theories that will undoubtedly reshape the way we view information.

 

The main takeaways from the Miami Dade Bar Association conference, and for our readers, include a wake-up call: AI is a transformational development that will impact every legal practice. Lawyers who use AI well will have a competitive edge over those who fail to integrate the technology into their daily practice.

 

Firms need to ensure that the AI they take on board is controlled and operated in line with clear policies and guidelines that safeguard their clients' confidential information and preserve the attorney-client privilege. Vetting of AI product vendors should be sharply focused on the firm's needs, the product's actual capacity for productivity and cybersecurity, and the system's ease of use.

 

AI is here to stay. Get to know it, work with it informally and become familiar with how it works and what use you may make of it. Your competition is certainly going to.
