On January 11, 2026, The Express Tribune reported comments from Blue Archive producer Yongha Kim warning that careless generative AI use is lowering game content quality and eroding player trust. Kim said Nexon Games uses AI mainly for support tasks like speech synthesis and recognition, not to replace core creative work.
Yongha Kim’s “AI slop” critique captures a growing backlash in creative industries: fans aren’t angry at AI in the abstract; they’re angry at lazy, low‑effort content that uses AI as a shortcut. In gacha games, where players pour time and money into storylines, character art and music, the sense that a studio has quietly swapped craftsmen for prompts can feel like a betrayal. ([tribune.com.pk](https://tribune.com.pk/story/2586455/gacha-game-blue-archive-producer-warns-careless-ai-use-is-eroding-player-trust))
For the AGI race, this is a reminder that capability isn’t the only constraint. Even if models can generate acceptable assets at near‑zero marginal cost, audiences may force developers to retain a visible layer of human authorship, at least in narrative and aesthetic domains. Nexon Games’ approach of using AI to speed up support functions like TTS and internal tools while keeping core creative decisions human looks like a template for how mainstream studios might thread that needle. That doesn’t slow research on more general systems, but it may limit where and how aggressively those systems are deployed in commercial content pipelines in the near term.
Strategically, studios that invest in transparent, high‑quality AI workflows, and that openly explain what is and isn’t automated, are likely to win loyalty over those that quietly swap in generic AI output. Trust capital becomes as important as model capital.