who owns what when you use ai?

In collaboration with UCL's Centre for Sustainability and RealTech Innovation, we are bringing you the first in a three-part webinar series showcasing our new executive education course.


about the event

Now comes the question that too many organisations ignore until it’s too late.

Who actually owns the outputs your AI generates?

- The content, code, or designs created by AI tools
- The intellectual property embedded in training data
- The liability if something goes wrong

These questions aren’t abstract. Organisations are already facing disputes, vendor challenges, and legal uncertainty.

This one-hour webinar gives senior leaders the frameworks and practical insights to make these decisions with confidence. It also serves as a gateway to the Executive Education course, where participants go deeper, applying these frameworks hands-on to their own organisations.

date

22nd April 2026

time

12:00 - 13:00 GMT

location

Online
Dafina Jasari
transaction lawyer, montana capital partners
Coming soon...
Professor Eva Micheler
professor of law, london school of economics (lse)
Eva Micheler is a Professor of Law at the London School of Economics and a member of the management committee of the Systemic Risk Centre at LSE. Professor Micheler has written widely on corporate and comparative law, with intermediated securities and holding and transfer systems as a significant focus of her work. Her scholarship has been cited by the UK Supreme Court and the Austrian Oberster Gerichtshof, and she contributes to leading legal texts including Gower and Davies' Principles of Modern Company Law and Gore-Browne on Companies.

She is an Ausserordentlicher Universitätsprofessor at the Vienna University of Economics and Business, a member of the board of the Institute of Central and East European Business Law in Vienna, and a member of the Investor Protection and Intermediaries Standing Committee at the European Securities and Markets Authority. She has also advised the UK Department for Business, Innovation and Skills on questions relating to intermediated shareholdings.
Fabrice Ciais
head of responsible ai, g42
With over 25 years of experience in AI risk, governance and responsible innovation, Fabrice currently serves as Group Head of Responsible AI at G42, where he focuses on building safe and accountable AI systems and advancing AI validation and scaling strategies.

His work centres on enabling organisations to navigate AI-driven transformations with robust governance frameworks and long-term ROI, working closely with regulators, standard setters and technology innovators to ensure responsible and sustainable AI adoption.
Victoria Thompson
technology & ip lawyer, astrazeneca
Victoria Thompson is a senior lawyer with over 10 years of C-suite advisory experience across a diverse range of financial institutions, from start-up to corporate. She has advised and guided transformation and innovation initiatives across the international arena, with expertise spanning commercial law, intellectual property, data and IT.

Victoria recently completed an academic sabbatical with UCL's Computer Science faculty, researching AI, data, DLT and quantum technology law and policy. She brings board-level experience and a strong track record in corporate governance, business transformation and stakeholder engagement.
why attend?

AI adoption is moving faster than governance, policies, and contracts. Without clarity, organisations risk:

- Unclear ownership of AI-generated assets
- Copyright or licensing violations
- Vendor lock-in or contractual exposure
- Legal and reputational risk

This webinar equips leaders with a structured, practical approach to protect their organisations before problems arise.

what you’ll learn in 60 minutes

Our expert panel will address the questions executives are actually asking:

1. Ownership of AI-Generated Outputs
- Who owns content, code, and designs generated by AI
- How human input affects ownership claims
- Where ambiguity exists and how to mitigate it

2. Copyright and Protection
- Can AI-generated outputs be copyrighted or otherwise protected?
- Understanding “originality” in an AI context
- Practical implications for software, marketing, and creative assets

3. Training Data and Liability
- Risks tied to models trained on copyrighted material
- What happens if your organisation is challenged legally
- How liability is allocated between vendors and users

4. Vendor Terms and Negotiation
- What contracts and terms of service typically say about IP
- Key clauses to watch for to protect your organisation
- Negotiating practical protections without slowing adoption

5. Policies, Guardrails, and Governance
- Internal policies organisations should implement now
- Guardrails for employees using generative AI tools
- Governance structures that scale with adoption
- Aligning legal, technical, and business teams around AI risk

what you will walk away with

By the end of this webinar, participants will be able to:

- Understand ownership and copyright risks associated with AI outputs

- Assess legal and IP exposure across internal and vendor-led initiatives

- Evaluate vendor contracts and licensing terms with confidence

- Implement organisational policies and guardrails to mitigate risk

- Communicate AI IP considerations clearly to boards, legal teams, and stakeholders

who should attend?

- C-suite and senior executives
- Board members and non-executive directors
- Strategy, transformation, and innovation leaders
- Anyone responsible for approving, funding, or overseeing AI initiatives

This webinar is the first in a broader collaboration with the UCL Centre for Sustainability and RealTech Innovation, designed to support leaders who want to move beyond experimentation and build credible, responsible AI strategies.

For those who want to go deeper, this session connects directly to our Executive Education course, where participants apply these frameworks hands-on to their own organisations with expert guidance.

The webinar gives you the foundations.

The course helps you implement them.