In an unusual corporate saga highlighting the risks of relying on artificial intelligence for high-stakes decisions, the Chief Executive Officer of Krafton, the company behind the popular battle royale game PUBG: Battlegrounds, reportedly turned to ChatGPT for legal counsel. The move was intended to support the dismissal of the head of the studio developing Subnautica 2.
AI-Driven Corporate Maneuver
The CEO’s objective was to remove the head of Unknown Worlds Entertainment, the studio actively working on the anticipated sequel, Subnautica 2. Instead of consulting traditional legal experts, the executive sought guidance from an AI chatbot. This decision, seemingly aimed at streamlining a complex personnel change, set the stage for a significant corporate misstep.

Photo: 404media.com
Disregarding Expert Legal Guidance
Crucially, the CEO’s reliance on ChatGPT directly contravened the advice of his own legal department. Company lawyers are tasked with navigating employment law and contractual obligations precisely to prevent costly disputes. By sidestepping their counsel in favor of an AI-generated strategy, the CEO ignored the very expertise designed to protect the company’s interests.
The Unfavorable Verdict
The outcome of this AI-informed strategy was stark: the legal challenge mounted by the CEO, based on ChatGPT’s recommendations, failed in court, resulting in an unfavorable judgment. The incident serves as a cautionary tale, underscoring the importance of human legal expertise in complex corporate matters and the pitfalls of substituting artificial intelligence for it, particularly when multi-million-dollar contracts are at stake.
