Saturday, April 25, 2026

Anthropic supply chain risk determination suspended by judge


Anthropic won a preliminary injunction prohibiting the U.S. Department of Defense from considering it a supply chain risk, potentially clearing the way for customers to resume doing business with the company. Thursday’s ruling by Rita Lin, a federal district judge in San Francisco, is a symbolic setback for the Pentagon and a significant boost for the generative artificial intelligence company as it struggles to preserve its business and reputation.

“Defendants’ characterization of Anthropic as a ‘supply chain risk’ is arguably both unlawful and arbitrary and capricious,” Lin wrote in justifying the temporary halt. “The Department of War provides no reasonable basis to suggest that Anthropic’s insistence on use restrictions constitutes sabotage.”

Anthropic and the Pentagon did not immediately respond to requests for comment on the ruling.

The Department of Defense, which is called the Department of War under Trump, has used Anthropic’s Claude AI tools over the past few years to create sensitive documents and analyze classified data. However, this month the department began cutting off Claude after determining that Anthropic could not be trusted. Pentagon officials cited numerous instances in which Anthropic allegedly imposed or attempted to impose use restrictions on its technology that the Trump administration deemed unnecessary.

Ultimately, the administration issued several directives, including one designating the company as a supply chain risk, which led the federal government to gradually halt its use of Claude and harmed Anthropic’s sales and public reputation. The company filed two lawsuits challenging the sanctions as unconstitutional. During Tuesday’s hearing, Lin said the government appeared to be illegally “crippling” and “punishing” Anthropic.

Lin’s Thursday ruling “restores the status quo” to what it was on Feb. 27, before the directives were issued. “It does not prohibit the defendants from taking any lawful action otherwise available to them,” she wrote. “For example, this order does not require the Department of War to use Anthropic products or services and does not prevent the Department of War from switching to other artificial intelligence providers so long as those activities are consistent with applicable laws, statutes, and constitutional provisions.”

The ruling suggests that the Pentagon and other federal agencies may no longer cite the supply chain risk designation as a basis for canceling contracts with Anthropic or for asking contractors that integrate Claude into their own tools to stop doing so.

The immediate impact is unclear, as Lin’s order won’t take effect for another week. A federal appeals court in Washington has not yet ruled on a second lawsuit filed by Anthropic, which involves a different law that also barred the company from providing software to the military.

But Anthropic could use Lin’s ruling to show customers concerned about dealing with an industry pariah that the law may be on its side in the long run. Lin did not set a timetable for a final ruling.
