On April 3, 2026, Illinois Attorney General Kwame Raoul warned that federal agencies are buying bulk data from private brokers and using AI to infer detailed profiles of Americans without warrants. He and 16 other state attorneys general are urging Congress to close these loopholes and require deletion of unlawfully collected data.
This article aggregates reporting from one news source. The TL;DR is AI-generated from the original reporting. Race to AGI's analysis provides editorial context on the implications for AGI development.
This story is less about a new model and more about the data exhaust that feeds AI systems. Raoul is pointing at a structural shift: once you combine bulk location, purchase, and behavioral data with modern inference models, you no longer need wiretaps or subpoenas to reconstruct someone's life; you can statistically infer it. That collapses practical privacy even if the formal legal standards haven't changed.
For the race to AGI, this matters because surveillance and targeting are among the most lucrative near-term uses of large-scale AI. If bulk data purchases remain weakly regulated, the default business model rewards ever more invasive profiling, which in turn generates rich training corpora for behavioral prediction. That feedback loop could accelerate capabilities in modeling human behavior, persuasion, and social control: areas many people worry about in an AGI context.
The political response Raoul is calling for, treating data brokerage as a Fourth Amendment issue, could nudge the U.S. closer to an EU-style stance in which certain data practices are flatly off-limits regardless of technical feasibility. That would not stop AGI work, but it would shape which applications scale first and how deeply general-purpose models become entangled with state surveillance.

