November 2025

Insights from our Panel, ‘Practicing in the Age of AI: Navigating Risk, Roles and Regulation’

The use of AI in the legal sector might be relatively new, but it is already raising difficult questions and ethical concerns alongside the opportunities it brings.

To mark Larson Maddox’s expansion into Europe, and to help local partners and legal leaders understand these challenges and guide the effective use of AI at their firms, we recently held an in-person breakfast panel in London: ‘Practicing in the Age of AI: Navigating Risk, Roles and Regulation’.

Louise Shearing, Director and Head of UK Legal at Larson Maddox, moderated the discussion. Here, she shares why AI was chosen as the focal theme:

Larson Maddox London wanted to bring stakeholders together and provide insight on a hot topic of discussion, AI. The legal market is highly regulated and with AI evolving and changing the legal landscape it’s an interesting and challenging one to navigate. We wanted to provide real-time examples which can help educate everyone in the room with key changes, whilst sparking important debates on how or when to use AI.

Read on to discover some of the key discussion points, takeaways, and learnings from the day.

About the panelists

Tara Waters

Tara is a former technologist, law firm partner and Chief Digital Officer who has a strong reputation and track record in legal innovation and digital transformation, enhanced by her background working in both the technology and legal industries.

Tara currently runs her own digital transformation consultancy, supporting law firms, businesses and technology companies on strategy, governance and execution. Over the past few years she has had a particular focus on AI and has led tool selection and pilot processes in multiple firms, as well as one of the first global rollouts of a legal AI platform.

Egbe Manton 

Egbe is a lawyer and founder with over 13 years’ experience at the intersection of law and social impact. She is the Founder of Inspire Law Global, a platform advancing legal talent through coaching, curated learning, and AI adoption. Egbe designs and delivers high-impact programs that equip emerging legal professionals with the skills, mindset, and AI tools they need to thrive in a changing legal landscape.

Her work bridges deep legal expertise with a human-first approach to leadership and access. Known for building inclusive talent pipelines and fostering cultural change, Egbe champions AI innovation and empowerment across the legal profession. Her core focus: AI, community, and preparing future lawyers for the evolving world of work.

Nigel Spencer

Nigel is a multi award-winning expert in leadership development in the legal sector. A qualified Executive Coach, Nigel is also currently Professor of Professional Practice at Queen Mary’s Law School, and over many years he has carried out extensive research into the changing career paths and future skills needs for legal professionals.

Previously, Nigel led global talent development functions at law firms Simmons & Simmons LLP and Reed Smith LLP, was a Director of Executive Education at Saïd Business School, University of Oxford, and recently completed a book on being an effective leader in the legal sector. He has also held various Board-level roles across the sector, advising on effective talent development and assessment strategies, and has developed award-winning technology tools for skills and legal careers development.

Crowley Woodford

Crowley is a partner in the London office of Ashurst, where he leads the firm’s London and European employment practice. He specializes in all aspects of contentious and non-contentious employment law, with a particular emphasis on discrimination, employee disputes, and workforce restructuring.

Crowley has extensive experience in corporate governance matters, executive arrivals and departures, whistleblowing, discrimination claims, protecting confidential information, and introducing change management to the workplace. Crowley is also an expert in leading complex internal investigations including within a regulatory environment.

Expert insights on AI’s impact on risk, roles, and regulation in the legal sector

AI regulation and accountability in legal practice

Louise opened the panel’s discussion by asking how AI regulation is already affecting legal practice. Crowley noted that firms are investing significantly in AI across research, drafting and document translation, even though clients respond to its use in different ways: some request clear confirmation when AI has been involved, while others expect its use to increase speed and reduce cost. This is prompting firms to introduce AI-use clauses that formalize accountability and protect both sides.

Tara added a reminder that the EU AI Act, which entered into force on 1 August 2024 and applies in phases, governs both the development and operational use of AI according to risk categories. While the Act is still being rolled out, certain applications are already prohibited, such as emotion monitoring in the workplace, others, such as AI-driven hiring, are treated as high-risk, and ensuring AI literacy among staff is now a legal requirement. She is advising her clients to implement compliance frameworks in these areas; however, because AI models update so frequently it is difficult to keep pace, and best practices are still being developed.

AI training and skills development for junior lawyers

The second focus of the panel was how education programs are adapting to the age of AI. Nigel encourages his students to use AI and then critique it, helping them develop the skill of evaluating AI outputs rather than accepting them at face value. He also highlighted the value of work placements, which let students observe real workflows and learn how practitioners review and question AI-supported work. Nigel strongly encourages firms to offer apprenticeships and placements so that students can apply their learning in practice, build their confidence, and develop new connections.

Another important theme was the psychological impact of AI on junior talent development, with Egbe drawing attention to the pressure junior lawyers face today. Gen Z are expected not only to train and understand the technical side of their work, but also to be AI-savvy and able to produce work quickly and accurately. It’s vital for senior leaders to ensure their junior teams have the support, training and structured guidance they need, including in areas such as AI accountability.

When it comes to implementing this training, some firms already have specific programs in place, Tara added, but true understanding and capability come only from hands-on use. Partner-level supervision is also important when training juniors, but it requires partners themselves to be properly trained. However, because generative AI is still so new, internal subject matter expertise is often limited.

Managing AI risk and legal quality control

The conversation then moved to risk, where Crowley highlighted the importance of maintaining quality as AI adoption accelerates. Although firms that have already prioritized AI can currently charge a premium for their services as a differentiator, he questioned whether the market will become a race to the bottom on price as AI becomes more widely embedded, reducing cost at the expense of standards.

The challenge he described is twofold: firms need strong review procedures to protect the quality of legal work, and they must do so while meeting different client expectations about how AI should be used. One solution is to run comparison exercises where teams complete the same task manually and with AI, then assess differences. This practice helps validate accuracy and provides reassurance to clients that quality is not being compromised.

Tara explained why safeguards like these are essential, advising that hallucinations are an inherent feature of generative AI rather than a bug, as some might assume, and that there are already at least 460 recorded cases of fabricated citations making their way into court filings. Because AI outputs appear credible, human review must always remain in place, yet this task is often passed to junior lawyers when senior colleagues lack the time. This, Egbe noted, creates risks for both client delivery and professional development.

Evolving legal roles, AI capability and talent strategy

The final part of the discussion focused on how AI is changing legal roles and skill expectations. Tara emphasized that accountability for AI adoption needs to sit across the entire business, not only within technical teams. Some firms are beginning to define competencies that all lawyers are expected to meet, while others are creating new roles such as Legal Technologist or AI Governance Lead. These roles require technical fluency and commercial understanding and are still scarce in the market, so firms may need to review where they can develop and upskill internally, alongside targeted external hiring.

A firm’s AI capabilities and training are also now directly influencing its talent pipeline, whether knowingly or not. Egbe shared that junior lawyers are increasingly choosing employers based on how well they support AI skill development. While a competitive salary used to be the primary factor in career decisions, many are now actively picking firms that invest in AI training and tools where they can develop their skills.

Crowley concluded by adding that effective AI adoption, reduced risk, and ultimately return on investment all depend on senior lawyers understanding how to integrate these tools. Setting standards, overseeing adoption, and ensuring adequate review must come from the top if AI is to improve workflows in a meaningful way.

Final conclusions on AI’s role in the legal sector

The panel agreed on several conclusions:

  • AI will not replace lawyers, but it will change how legal work is carried out.
  • It’s critical for all legal professionals using AI to develop the skills to question, evaluate, and correct AI outputs.
  • Knowing when to use AI – and when not to – is just as important, and organizations must be transparent about how AI is being applied.
  • Effective AI implementation and return on investment require AI to be understood and embedded from the top down, with clear review frameworks, well-designed training, and consistent supervision.

The final shared message was that AI introduces new efficiencies, but human judgment will always be at the core of legal practice. One will never replace the other, so firms will need to find a way of working with AI that strikes the right balance, provide training and support to reduce business risks, and continue to move forward with the best interests of clients at heart.

Build future-ready legal teams with Larson Maddox

Thank you to our panel speakers for their insight and openness, and our attendees for contributing to a thoughtful and engaged discussion.

As the legal sector continues to navigate the era of AI, Larson Maddox is here to help organizations define the skills and roles they need, including hybrid legal profiles and emerging positions in AI governance and legal technology.

To discuss your upcoming hiring needs or to explore current regulatory & legal opportunities in Europe and around the globe, request a call back or browse our open roles.

Find the talent you need

Get in touch with one of our talent specialists, who will be able to offer bespoke guidance on hiring in line with your business objectives and resourcing plans.

Looking for a new role?

Start your search today and take the next step in your career - browse regulatory & legal roles at industry-leading firms with Larson Maddox.