Friday, March 20, 2026

Trae Stephens Built an AI Weapon and Worked for Donald Trump, and He Says Jesus Would Approve of It

When I wrote about Anduril in 2018, the company explicitly said it would not build lethal weapons. Now you build fighter jets, underwater drones, and other lethal weapons of war. Why did you make this change?

We responded to what we saw, not just in our military, but around the world. We want to be consistent with delivering the best capabilities in the most ethical way. The alternative is that someone is going to do it anyway, and we believe we can do it best.

Were there any in-depth discussions before you crossed that line?

There are constant internal discussions about what to build and whether there is an ethical fit with our mission. I don’t think there is much benefit in trying to set our own line when the government is actually setting the line. They have given clear direction on what the military will do. We follow the example of our democratically elected government, which tells us what their problems are and how we can be helpful.

What is the proper role of autonomous AI in warfare?

Fortunately, the US Department of Defense has done more work on this than any other organization in the world, aside from the big companies building foundational generative AI models. There are clear rules of engagement that keep humans in the loop. You want to take people off the dull, dirty, and dangerous tasks and make decision-making more productive, while always keeping a person accountable at the end of the day. That's the goal of all the policies that have been put in place, regardless of how autonomy evolves over the next five or 10 years.

During a conflict, it can be tempting not to wait for a human response when targets appear in the blink of an eye, especially with weapons like autonomous fighter jets.

The autonomy program we are working on for the Fury aircraft [a fighter used by the US Navy and Marine Corps] is called CCA, Collaborative Combat Aircraft. There is a person in a plane controlling and commanding the robot fighter jets and deciding what they do.

What about the drones you build that hover in the air until they spot a target, then dive onto it?

There's a class of drones called loitering munitions: aircraft that search for targets and then have the ability to kinetically strike them, kamikaze-style. Again, there is a human in the loop who is accountable.

War is messy. Isn't there a genuine fear that these principles will be set aside once hostilities begin?

Humans fight wars, and humans are imperfect. We make mistakes. Even back when we stood in lines and shot each other with muskets, there was a process for adjudicating violations of the rules of engagement. I think that will continue. Do I think there will never be a case where some autonomous system is asked to do something that looks like a flagrant ethical violation? Of course not, because humans are still in charge. Do I think it is more ethical to prosecute a dangerous, messy conflict with robots that are more precise, more discriminating, and less likely to escalate? Yes. Deciding not to means putting more people at risk.

Photo: Peyton Fulford

I'm sure you're familiar with Eisenhower's farewell address about the dangers of a military-industrial complex that serves its own needs. Does that warning affect the way you operate?

It's one of the great speeches of all time—I read it at least once a year. Eisenhower was describing a military-industrial complex in which the government isn't all that different from contractors like Lockheed Martin, Boeing, Northrop Grumman, and General Dynamics. There's a revolving door at the top of these companies, and they become centers of power because of that interconnectedness. Anduril is pushing a more commercial approach that doesn't depend on that tightly coupled incentive structure. We're saying, "Let's build things at the lowest cost, using off-the-shelf technology, and let's do it in a way where we take on a lot of the risk ourselves." That avoids some of the tensions Eisenhower identified.
