Krafton CEO's ChatGPT advice led to a 3.7 billion Won liability.

The Story
Krafton's CEO reportedly acted on advice obtained from ChatGPT, and the resulting decision left the company facing a 3.7 billion Won liability. The penalty stemmed directly from AI-generated advice that was followed without independent human verification.
Why It Matters
This isn't just a headline for a gaming company. It's a clear signal to any European business, law firms especially, about the direct financial and reputational risks of ungoverned AI use. Relying on AI-generated advice without human oversight can create significant liability, as Krafton's 3.7 billion Won penalty shows. For your firm, that means setting clear internal policies on using AI with client data and advice, well ahead of EU AI Act enforcement.
What To Do About It
Start by auditing where AI is already in use across your firm, even informally, for client-facing tasks or advice generation. Then establish an internal AI governance committee. Its first tasks should be drafting a "no client data on public LLMs" policy and reviewing your Microsoft 365 Copilot settings for data privacy.
