On December 30, 2025, South Korea’s National Assembly passed an amendment to the Artificial Intelligence Basic Act requiring the participation of vulnerable groups, including people with disabilities and older adults, in AI policy and impact assessments. Effective January 22, 2026, the law embeds obligations to consider these groups throughout AI policy‑making, data building and high‑impact AI evaluations.
South Korea is quietly pushing AI governance in a more participatory direction by hard‑coding vulnerable‑group input into its AI Basic Act. This isn’t about model weights or chip counts; it’s about who gets a voice when systems are designed, deployed and audited. By obligating policymakers to involve people with disabilities, older adults and other at‑risk groups at the planning and data‑collection stages, not just after harms occur, the law tries to move bias mitigation upstream instead of relying on ad‑hoc fixes.
