
Training Attorneys to Use AI-Powered Technology

Estimated reading time: 8 minutes

Almost two years after ChatGPT’s launch redefined public understanding of what artificial intelligence is capable of, it’s reassuring — and unsurprising — that robots have not replaced lawyers. The technology’s limitations, including hallucinations and a lack of nuanced judgment, underscore the value of attorneys’ acquired knowledge and human judgment.

What has changed is the extent to which law firms leverage AI to support, rather than replace, their lawyers; those who fail to adapt risk falling behind. But implementing new technology at your firm isn’t as simple as flipping a switch. It requires a design-thinking process, careful training and thoughtful integration.

While you don’t need to be an expert in AI to apply it to your practice, you do need a reasonable understanding of its capabilities and limitations. That’s according to the American Bar Association (ABA) opinion issued July 29, which outlines key ethical obligations for lawyers using generative AI. Much of this guidance aligns with the State Bar of California’s Practical Guidance for the Use of Generative AI in the Practice of Law, issued in 2023. Meanwhile, a recent ABA survey shows that law schools are increasingly teaching law students about AI.

Here are five important considerations for California firms training their attorneys on using AI-powered technology.

Teach the art and science of prompt design

When it comes to legal technology powered by generative AI, the outputs you receive will only be as good as the prompts you provide. And you guessed it — writing prompts for legal scenarios requires a more nuanced approach than asking for book recommendations or recipes.

Train your lawyers to structure prompts that reduce the risk of inaccuracies by being as specific, clear and context-rich as possible. For example, if you want to assess compliance with California’s AB 5, which redefines who qualifies as an independent contractor, your prompt might look something like this:

“I am an in-house lawyer in California assessing whether a communications company’s independent contractor agreements comply with California Assembly Bill 5 (AB 5). Please analyze the following document for compliance. AB 5 redefines independent contractors and uses the ‘ABC test’ to determine whether a worker is properly classified as an independent contractor or an employee.

In your analysis, include the following:

I. Key provisions: Outline how the document aligns with or deviates from AB 5’s requirements, including the ‘ABC test’ criteria:

A. The worker is free from control and direction of the hiring organization in connection with the performance of the work, both under the contract for the performance of the work and in fact.

B. The worker performs work that is outside the usual course of the hiring entity’s business.

C. The worker is customarily engaged in an independently established trade, occupation or business.

II. Detailed comparison: Compare the document’s provisions against each ABC test criterion. Highlight any discrepancies or areas that might indicate noncompliance.

III. Specific examples: Provide specific examples from the document that illustrate how the company either meets or fails to meet each criterion of AB 5.

IV. Recommendations: Suggest any necessary changes or actions that the company should take to ensure compliance with AB 5.

Here is the text of the document: <INSERT>.”

For complex and confidential legal work, ensure that your firm avoids public-facing tools such as ChatGPT and Claude because these systems are not designed or equipped to guarantee the security and privacy of sensitive data. The California Bar’s AI guidelines caution lawyers never to input confidential information into a prompt.

Prompt training should also be part of an ongoing conversation with your legal technology vendor, which can measure and tweak AI performance and prompt effectiveness to suit your firm’s needs.
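
Firms that build internal tooling with their legal technology vendor can go a step further and standardize this structure as a reusable template, so attorneys supply only the document and the scenario instead of rewriting the prompt each time. Here is a minimal sketch in Python; the template wording and function names are illustrative assumptions, not features of any particular vendor's product, and real documents should only be submitted through a secure, firm-approved platform.

```python
# Illustrative sketch of a reusable compliance-review prompt template.
# The template wording and names below are hypothetical examples, not
# part of any specific vendor's product.
from string import Template

AB5_REVIEW_PROMPT = Template("""\
I am an in-house lawyer in California assessing whether a company's
independent contractor agreements comply with California Assembly Bill 5 (AB 5).
Please analyze the following document for compliance with the 'ABC test'.

In your analysis, include:
I. Key provisions: how the document aligns with or deviates from AB 5.
II. Detailed comparison: the document's provisions against each ABC test criterion.
III. Specific examples: passages showing where the company meets or fails each criterion.
IV. Recommendations: changes needed to ensure compliance with AB 5.

Here is the text of the document: $document_text
""")


def build_prompt(document_text: str) -> str:
    """Fill the template with the agreement under review."""
    return AB5_REVIEW_PROMPT.substitute(document_text=document_text)


if __name__ == "__main__":
    # Placeholder text only; confidential documents belong in a secure,
    # firm-approved platform, never a public-facing tool.
    print(build_prompt("[agreement text goes here]"))
```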

Underscore AI’s limitations

Lawyers have a duty of competence and candor to check the accuracy of all AI-generated content before relying on it in court or as the basis for advising clients, according to the ABA’s opinion and the California Bar’s guidance. Both also emphasize the importance of understanding the risks of using any technology, AI or not.

Use your training programs to outline the limitations of AI and teach lawyers how to critically assess outputs, including analysis and citations, so they can catch errors, misstatements of fact and law, misleading arguments and omissions of controlling legal authority. Don’t overlook legal research, where AI can provide significant benefits: training here should cover how to combine AI tools with traditional research methods to enhance efficiency without compromising accuracy.

Do more than hand people booklets to read or PDFs of PowerPoints to click through — have lawyers learn by doing. Consider offering regular workshops, running roleplay scenarios or attending seminars that demonstrate flawed outputs and how to spot them. Use simulated legal cases where AI tools provide analysis and have lawyers critique the output, highlighting errors or omissions. Your firm might also develop a critical review checklist that lawyers can use to vet AI-generated content for accuracy and cross-check sources for reliability and relevance.

The extent to which lawyers will need to review AI-generated outputs will depend on the tool they’re using and what they’re using it for. As the ABA opinion notes, tools designed “for the practice of law or to perform a discrete legal task, such as generating ideas, may require less independent verification or review, particularly where a lawyer’s prior experience with the [generative AI] tool provides a reasonable basis for relying on its results.”

To be accurate enough for lawyers to rely on, AI tools should be specifically designed for the legal industry — and so should their mechanisms for preventing and addressing errors. One such mechanism is supervised machine learning, in which a human trainer actively guides the AI, teaching it the correct responses until it can make accurate decisions independently. Another common safeguard is user validation, in which the AI repeats requests back to the user to ensure it has correctly understood the input before generating a response. Your firm might also consider training a large language model (LLM) on your specific data to produce highly customized responses.
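
To make the user-validation safeguard concrete, here is a minimal sketch in Python. The generate_response stub stands in for a hypothetical call to a firm-approved legal AI platform; the point is only that the tool restates its understanding and waits for confirmation before producing any output.

```python
# Illustrative sketch of a user-validation safeguard: restate the request
# and ask the user to confirm before any response is generated.
# `generate_response` is a stand-in for a hypothetical platform call.

def generate_response(request: str) -> str:
    # Stub: a real tool would call the firm's approved AI platform here.
    return f"[AI analysis of: {request}]"


def validated_request(request: str) -> str | None:
    """Confirm the tool's understanding of the request before generating."""
    answer = input(f"I understand you want me to: {request}. Is that correct? (yes/no) ")
    if answer.strip().lower() == "yes":
        return generate_response(request)
    return None  # the user rephrases the request instead


if __name__ == "__main__":
    print(validated_request("summarize the key risks in this draft settlement agreement"))
```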

But even legal tools using generative AI are prone to accuracy issues — one recent Stanford University study found they hallucinate between 17% and 33% of the time.

Construct clear ethics guardrails

Client communication and consent should be a vital part of AI training for attorneys, as the ethical implications are complex and evolving. The ABA and the California Bar have said that lawyers must consider whether they have a duty to disclose generative AI use to their clients or obtain informed consent to use a particular tool, noting that the answer will depend on the facts of each case.

Clear and transparent billing practices are also essential to maintaining trust and fairness in how AI tools are accounted for, according to the ABA’s opinion and the California Bar’s guidelines, which say fees must reflect both the time invested and value delivered.

Establish guidelines for your firm and train your attorneys to navigate these ethical responsibilities, including when and how to disclose AI use and obtain informed client consent while maintaining professional integrity. It’s also important to train and supervise nonlawyer staff on using AI tools to ensure compliance with ethical standards.

Hammer home privacy and data security pitfalls

Legal information is incredibly sensitive, so think twice before inputting any data into an AI-powered tool. Your firm should have a full understanding of where the information shared with a tool is stored, how it is used and with whom it is shared.

Develop best practices and protocols so your attorneys have clear guidelines for safely using AI tools, including how to minimize data exposure and protect client confidentiality. Choose private, secure platforms that keep data on servers controlled by the firm rather than public-facing ones like ChatGPT. Even then, train attorneys not to share sensitive or confidential information unless absolutely necessary.
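
As one concrete protocol, some firms add an automated data-minimization step that strips obvious identifiers before text ever reaches an AI tool. The sketch below is illustrative only; the patterns shown are examples, and any real redaction workflow should be designed and reviewed with your firm's security and ethics leads.

```python
# Illustrative data-minimization step: replace obvious identifiers with
# labeled placeholders before text is sent to an AI tool. The patterns
# below are examples only and will not catch every identifier.
import re

REDACTION_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "PHONE": re.compile(r"\(?\d{3}\)?[ .-]?\d{3}[ .-]?\d{4}\b"),
}


def redact(text: str) -> str:
    """Replace each matched identifier with a labeled placeholder."""
    for label, pattern in REDACTION_PATTERNS.items():
        text = pattern.sub(f"[REDACTED {label}]", text)
    return text


if __name__ == "__main__":
    sample = "Reach the claimant at jane.doe@example.com or (415) 555-0123."
    print(redact(sample))
    # Reach the claimant at [REDACTED EMAIL] or [REDACTED PHONE].
```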

When vetting service providers, consider how stringent their security protocols are to reduce the risk of leaks and external hacks that could expose your firm to liability. Look for vendors that are at least SOC 1 Type 2 and SOC 2 Type 2 compliant and that carry internationally recognized certifications such as ISO/IEC 27001.

Encourage open dialogue

If attorneys and staff feel free to discuss their experiences, challenges and successes with AI tools, your firm will benefit from creative solutions and best practices. And since the technology is evolving rapidly, continuous learning and innovation aren’t just nice to have; they’re vital for remaining competitive.

Promote open dialogue by establishing regular meetings or forums for sharing insights and feedback on AI tools. Encourage transparency in discussing any limitations and ethical concerns to ensure all perspectives are considered in the decision-making process and risks are identified and addressed early on. This approach ensures that everyone, regardless of their level of tech comfort, is on board and able to contribute to the successful integration of AI.

AI’s place in the legal profession is still uncertain, and even the ABA opinion acknowledges that there could come a day when lawyers will need to use generative AI to competently complete certain tasks for clients. But in the meantime, law firms that train their attorneys to understand AI’s capabilities and limitations will be better positioned to adapt and thrive.

At CEB, we believe AI is a transformative and exciting technology for the legal profession, but attorneys remain the primary creators and curators of our legal content and offerings. Get in touch to schedule a free demo.