The Pentagon's top technology official has criticized Anthropic for imposing restrictions on how its chatbot Claude can be used in military systems.
On Friday, during an appearance on the "All-In" podcast, Emil Michael, the Pentagon's chief technology officer, said, "I need a reliable, steady partner that gives me something that'll work with me on autonomous."
Michael, who is also the defense undersecretary, added, "I need someone who's not going to wig out in the middle."
The clash between the U.S. Department of War and Anthropic has escalated after the Pentagon designated the San Francisco-based AI firm a supply chain risk.
The dispute centers on Anthropic's policies limiting how its AI model Claude can be used. The company has said it does not want its technology deployed for fully autonomous weapons or mass surveillance of Americans.
However, Pentagon officials argue such restrictions could hinder the military's ability to deploy AI systems in future conflicts.
AI's Role In Future Warfare And Missile Defense
Michael said the U.S. military is increasingly exploring AI-driven capabilities, including autonomous drone swarms, underwater vehicles, and automated defense systems, as rivals such as China invest heavily in similar technologies.
He also pointed to potential AI use in the Golden Dome missile defense program, an initiative backed by President Donald Trump that aims to deploy space-based defenses against advanced missile threats.
In a hypothetical scenario involving a Chinese hypersonic missile, Michael said the U.S. could have less than 90 seconds to respond, leaving little time for human decision-making.
He argued that autonomous responses could sometimes present a lower operational risk.
Anthropic Pushes Back, Legal Fight Looms
Anthropic has disputed parts of the Pentagon's account and said its safeguards are narrowly focused on preventing risky uses of AI.