Epic Games CEO Tim Sweeney argues banning Twitter over its ability to AI-generate pornographic images of minors is just 'gatekeepers' attempting to 'censor all of their political opponents'
Not the hill I'd die on, but I'm not a billionaire.

As reported by Eurogamer, Epic Games CEO Tim Sweeney took to X (formerly Twitter) to criticize US lawmakers' push to have the social media app and its accompanying generative AI tool, Grok, pulled from app stores. The move came after users discovered that Elon Musk's Grok could take photos of real people, including minors, and produce images of them undressed or in otherwise sexually compromising positions, flooding the site with such content.
"Reason #42 for open platforms: to shut down every politician’s incessant demands to all gatekeepers to censor all of their political opponents," Sweeney wrote in a first tweet responding to MacRumors' report of US politicians requesting that Apple and Google remove X and Grok from their app stores.
"All major AIs have documented instances of going off the rails," Sweeney continued in a follow up tweet. "All major AI companies make their best efforts to combat this; none are perfect. Politicians demanding gatekeepers selectively crush the one that's their political opponent's company is basic crony capitalism."
"AI going off 'guardrails' is not the same as actively excusing content for pedophiles," wrote Remap and former Waypoint editor, Patrick Klepek, in response. "Your priorities as someone in charge of a company that makes a video game catering to young people are completely off."
404 Media's January 5 report, from the beginning of this saga, offers illustrative examples of Grok's newly discovered capabilities: influencers undressed, made to appear pregnant, or shown breastfeeding a child. There are also extensive reports of users generating such material from images of minors.
The Rape, Abuse & Incest National Network (RAINN) defines child sexual abuse material (CSAM) as "evidence of child sexual abuse" that "includes both real and synthetic content, such as images created with artificial intelligence tools." One of RAINN's examples of CSAM is "any content that sexualizes or exploits a child for the viewer's benefit."