Saturday, March 7, 2026

Humans apply artificial intelligence to falsely identify the federal agent who shot and killed Renee Good


In the hours after a masked federal agent shot and killed Renee Nicole Good, a 37-year-old woman in Minneapolis, social media users shared artificial intelligence-altered photos, falsely claiming to be “unmasking” the officer by revealing their true identity. The agent was later identified by Department of Homeland Security spokeswoman Tricia McLaughlin as an Immigration and Customs Enforcement officer.

The shooting took place on Wednesday morning. Recordings from the scene posted to social media show two masked federal agents approaching an SUV parked in the middle of a road in a suburb south of downtown Minneapolis. One of the officers appears to ask the driver to exit the vehicle before grabbing the door handle. At this point, the driver appears to reverse, then drive forward and turn. A third masked federal officer, standing in front of the vehicle, draws a gun and fires into it, killing Good.

Videos of the incident shared on social media moments after the shooting did not show any of the masked ICE agents’ faces. However, within hours of the shooting, multiple photos purporting to show the unmasked agent began circulating on the internet.

The images appear to be screenshots from actual video footage that were then modified using artificial intelligence tools to generate a face for the officer.

WIRED reviewed multiple AI-altered photos of the unmasked agent shared on all major social media platforms, including X, Facebook, Threads, Instagram, Bluesky, and TikTok. “We need his name,” Claude Taylor, founder of the anti-Trump Mad Dog PAC, wrote in a post on X containing an AI-altered image of the agent. The post has been viewed over 1.2 million times. Taylor did not respond to a request for comment.

On Threads, an account called “Influencer_Queeen” posted an AI-altered image of the agent and wrote, “Let’s get his address. But let’s just focus on HIM. Not his kids.” The post was liked almost 3,500 times.

“AI-based enhancements tend to hallucinate facial details, leading to an enhanced image that may be visually clear, but may also lack reality in terms of biometric identification,” Hany Farid, a professor at the University of California, Berkeley, who has studied the ability of artificial intelligence to enhance facial images, tells WIRED. “In this situation, where half of the face is obscured, neither AI nor any other technique can, in my opinion, accurately reconstruct the identity of the face.”

Some people posting the photos also claimed without evidence that they had identified the agent, naming real people and, in many cases, sharing links to those people’s social media accounts.

WIRED confirmed that two of the names circulating do not appear to be directly linked to anyone associated with ICE. While many of the posts sharing these AI images have restricted engagement, some have gained significant traction.

One of the names shared online without evidence is Steve Grove, CEO and publisher of the Minnesota Star Tribune, who previously worked in Minnesota Gov. Tim Walz’s administration. “We are currently monitoring a coordinated online disinformation campaign that incorrectly identifies the ICE agent involved in yesterday’s shooting,” Chris Iles, vice president of communications at the Star Tribune, tells WIRED. “To be clear, the ICE agent has no known affiliation with the Minnesota Star Tribune and is certainly not our publisher or CEO Steve Grove.”

This isn’t the first time artificial intelligence has caused problems after a shooting. A similar situation occurred in September, when Charlie Kirk was killed: an artificial intelligence-altered image of the shooter, based on grainy video footage released by law enforcement, was widely shared on the internet. The AI-generated image bore no resemblance to the man who was ultimately captured and charged with Kirk’s murder.
