Anthropic Refuses to Bend to Pentagon on AI Safeguards
A senior official in the Department of Defense accused Anthropic of "lying" about how the U.S. military intends to use the private tech firm's Claude AI system.
The Pentagon may officially designate Anthropic as a "supply chain risk" to push the company out of government work, sources say.
The Pentagon previously requested that Anthropic, OpenAI, Google, and xAI allow the use of their AI models for “all lawful purposes.” Anthropic put up the most resistance, fearing its models could be used for autonomous weapons systems and mass domestic surveillance.
Anthropic AI defies Pentagon over expanded military use of its tech despite Hegseth blacklist threat
As well as designating Anthropic as a supply chain risk, the government could cancel the company’s contract or invoke the Defense Production Act, a Cold War-era law, to give the military sweeping authority to use Anthropic’s products even without the company’s approval.
Anthropic, the AI company behind the Claude chatbot, was founded with a focus on safe technology, but it appears to be scaling back its safety commitments to stay competitive: it has amended a set of self-imposed guidelines aimed at preventing the development of potentially dangerous AI.
A hacker exploited Anthropic PBC’s artificial intelligence chatbot to carry out a series of attacks against Mexican government agencies, stealing a huge trove of sensitive tax and voter information.
In January, Anthropic “retired” Claude 3 Opus, which at one time was the company’s most powerful AI model. Today, it’s back — and writing on Substack.
Anthropic CEO Dario Amodei rejected the Pentagon's ultimatum, saying "we cannot in good conscience accede to their request."
The US government has given the AI company behind Claude until 5pm to drop its restrictions on military use. The company says it won't, but it has already given ground elsewhere.