Anthropic use halted in U.S. agencies after Pentagon clash
President Donald Trump orders agencies to stop using Anthropic AI
On Feb. 27, 2026, President Donald Trump ordered all U.S. federal agencies, including the Department of Defense, to immediately cease using Anthropic’s AI technology, as reported by CNBC. The directive applies across civilian and national security portfolios and takes effect immediately.
The move follows a high-profile dispute between Anthropic and the Pentagon over contract terms, but the presidential order is broader than any single solicitation. It pauses government use of Anthropic’s AI tools while agencies evaluate compliance and alternatives.
Why it matters: impacts for DoD and federal AI projects
Anthropic has refused the Defense Department’s demand that its Claude model be made available for any lawful purpose, citing red lines on mass domestic surveillance and fully autonomous weapons, according to the Associated Press. The Pentagon has publicly asserted that it has no interest in illegal surveillance or in weapons use without human oversight, but the parties have not reached mutually acceptable contract language.
In the near term, agencies that integrated Claude into pilots or analytic workflows will likely pause access, reassess risk, and consider migration paths. Program leaders may need to update model governance artifacts, data handling procedures, and procurement files to reflect the order.
Anthropic’s leadership has framed its stance as a safety boundary rather than a negotiating tactic. “We cannot in good conscience accede,” said Dario Amodei, CEO of Anthropic.
The legal dimension is also material. Defense Secretary Pete Hegseth has raised the possibility of labeling Anthropic a supply chain risk or invoking the Defense Production Act by Friday, Feb. 27, 2026, as reported by the Financial Times. If pursued, those steps could reshape federal AI vendor pools and delay active procurements pending legal review.
At the time of this writing, Amazon.com, Inc. (AMZN) traded at $208.51, up 0.28%, according to Yahoo Finance data. This market figure is provided for context only and does not imply investment guidance.
Defense Production Act, mass surveillance, and autonomous weapons: key questions
Can the Defense Production Act compel changes to Anthropic’s guardrails?
Any attempt to use the DPA to alter private-sector AI safety guardrails would test the statute’s application to software governance rather than production capacity or prioritization. Even if invoked, downstream effects would still need to align with constitutional constraints and procurement law.
Jessica Tillipman of George Washington University Law School has flagged concerns about procurement integrity and about how ethical constraints are expressed in solicitations. Adjusting usage conditions midstream can raise bid-protest exposure and contract administration challenges.
How do mass surveillance and autonomous weapons map to government use?
Civil liberties groups warn that broad authorizations risk implicating Fourth Amendment limits on government searches and the constitutional framework for use of lethal force, according to Protect Democracy. Those concerns intensify when AI enables scale, speed, or opacity beyond traditional tools.
The core tension is definitional as much as technical: neither mass surveillance nor autonomous weapons has a single, controlling federal definition that clearly resolves the boundaries for deployment. In practice, agencies rely on policy, oversight, and human-in-the-loop controls, but the adequacy of those safeguards remains contested.

