Lawyers who better understand AI will feel more
confident using generated or automated responses,
leading to more efficient, effective, and accurate legal
work. This confidence is crucial for integrating AI
tools effectively into daily practice, as it encourages
greater adoption and maximizes the benefits of these
technologies, ultimately leading to improved client
outcomes and reduced risk.
THE ETHICAL IMPERATIVE FOR AI
EDUCATION
While keeping up with (or getting ahead of) your
competition is probably enough of a reason to convince
lawyers to learn about AI technologies, there is also an
ethical requirement. Comment 8 to Model Rule 1.1 of the
American Bar Association's Model Rules of Professional
Conduct (https://www.americanbar.org/groups/
professional_responsibility/publications/model_
rules_of_professional_conduct/rule_1_1_competence/
comment_on_rule_1_1/) states that lawyers "should
keep abreast of changes in the law and its practice,
including the benefits and risks associated with relevant
technology." This duty of technology competency
means that lawyers cannot entirely delegate their
understanding of AI to their team, IT, or vendors.
Lawyers are wholly responsible for the work they
submit to courts. If they, or any member of their team,
including non-lawyers, use AI, the lawyer must still
review and approve the resulting work. That is easy to say,
but putting it into practice is more nuanced.
AI technology is evolving rapidly, making continued
education essential for lawyers reviewing AI-generated
or AI-assisted work products. Without up-to-date
knowledge of how AI works and is being used, they
may not understand where to look for errors, and red
flags will not be as obvious.
For example, reliance on unchecked AI output can lead
to inaccurate legal research, faulty document drafting,
or even the presentation of fabricated case citations, all
of which can result in poor client outcomes or, worse,
damage to professional reputation and potential ethical
violations. Teams must understand this and treat
AI as a companion or a second set of eyes when
creating work products. Reviewing and verifying results
and outputs is essential, as is understanding
how the AI being used works and what materials it
sources for its answers.
HOW AI FLUENCY HELPS LEGAL
PROFESSIONALS MEET CLIENT
EXPECTATIONS
The recent Norton Rose Fulbright Litigation Trends
Survey reveals that nearly three-quarters of respondents
support the use of generative AI by outside counsel to
assist their company's litigation work (https://www.
nortonrosefulbright.com/-/media/files/nrf/nrfweb/
knowledge-pdfs/norton-rose-fulbright---2025-annual-
litigation-trends-survey.pdf).
Lawyers can also make the case that they are
delivering higher-value services through the use of
GenAI without incurring additional billable hours. Robert
Couture, a Senior Research Fellow at the Center on the
Legal Profession at Harvard Law School, conducted
qualitative interviews with chief operating officers and
partners responsible for AI deployment from 10 Am Law
100 firms. He quoted one law firm leader as saying: "AI
may cause the '80/20 inversion'; 80% of time was spent
collecting information, and 20% was strategic analysis