US GSA Draft Guidance Requires AI Companies to Allow 'Any Lawful' Government Use of Their Models
Draft guidance from the US General Services Administration (GSA) would tighten the rules for civilian AI contracts by requiring AI companies to permit 'any lawful' government use of their models, a significant broadening of the government's access rights over commercially licensed AI systems. The rule, reported by the Financial Times on March 6, 2026, would change the terms under which AI vendors sell to civilian agencies, potentially forcing companies such as Anthropic and OpenAI to grant the government sweeping use rights as a condition of federal contracts. The guidance arrives as the Pentagon's separate supply chain risk designation of Anthropic is already reshaping how AI companies approach federal procurement.
Key Takeaways
- US GSA draft guidance for civilian AI contracts would require AI companies to allow 'any lawful' government use of their models (reported by the Financial Times, March 6, 2026)
- Rule tightens the standard terms AI vendors must accept to win civilian federal AI contracts, going beyond current practice, under which licenses typically carry use-case restrictions
- Arrives alongside the Pentagon's supply chain risk designation of Anthropic and Dario Amodei's court challenge, which together make for a rapidly shifting US government AI procurement landscape
Original source: Financial Times / Techmeme