AI News Feed

AI Upscaling Misleads Charlie Kirk Investigation

12 Sep 2025: Users applied AI "upscaling" to the FBI's blurry Charlie Kirk suspect photos, generating speculative, often inaccurate faces that shouldn't be treated as evidence or definitive identification.


The FBI posted two blurry surveillance photos on X of a person of interest in the fatal shooting of Charlie Kirk. Almost immediately, X users — some using X's Grok bot, others using tools like ChatGPT — replied with AI-"upscaled" versions that turn the pixelated images into sharp, high-resolution faces. Those "enhancements" are often more guesswork than recovery: generative models infer likely features from patterns in their training data and can produce realistic but false details.
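To illustrate the mechanism (this is not the article's method; the model, prompt, and file names below are assumptions), here is a minimal Python sketch of generative upscaling with the open-source diffusers library and a public Stable Diffusion 4x upscaler, a stand-in for whatever Grok or ChatGPT runs under the hood. The point it shows: the text prompt and the model's training data steer what gets "recovered," so the sharp output is a plausible invention rather than the original face, while classical bicubic interpolation only smooths the pixels that are actually there.

```python
# Illustrative sketch only: generative "upscaling" vs. plain interpolation.
# Model choice, prompt, and file names are assumptions for demonstration.
import torch
from PIL import Image
from diffusers import StableDiffusionUpscalePipeline

low_res = Image.open("surveillance_crop.png").convert("RGB")  # hypothetical input

# Classical interpolation: still blurry, but it adds no new content.
bicubic = low_res.resize((low_res.width * 4, low_res.height * 4),
                         Image.Resampling.BICUBIC)
bicubic.save("bicubic_4x.png")

# Generative super-resolution: a diffusion model fills in detail it finds
# plausible given its training data and the prompt, not detail that was
# actually captured by the camera.
pipe = StableDiffusionUpscalePipeline.from_pretrained(
    "stabilityai/stable-diffusion-x4-upscaler", torch_dtype=torch.float16
).to("cuda")

# Changing the prompt can change the "face" the model invents from the
# same pixels, which is why such outputs are reconstructions, not evidence.
result = pipe(prompt="sharp photo of a young man's face",
              image=low_res).images[0]
result.save("generative_4x.png")
```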

Some viral attempts are plainly wrong (one shows a different shirt and a "Gigachad-level" chin), and the source article notes prior failures of AI upscaling — including a low-res photo of Barack Obama rendered as a white man and an AI adding a nonexistent lump to Donald Trump's head. While AI interpolation can sometimes be useful, these outputs shouldn't be treated as evidence in an investigation or a manhunt; they're reconstructions, not restorations, and they can mislead searches and spread misinformation. The FBI's original photos remain the authoritative source for identification.
