
"Agencies need explicit override rights inside the organization. Someone should be able to halt or narrow the use of a system without having to navigate an internal maze of legal, procurement, technical and managerial reviews after the risk has already materialized."
"If government cannot reconstruct who approved a system, what limits were attached to that approval, what changed over time and who chose to continue relying on the tool, then accountability becomes largely performative."
"Federal procurement needs to be treated as a governance instrument, not just a purchasing function. Contracts determine whether agencies retain control over the AI systems they deploy."
Federal AI governance frameworks emphasize principles such as fairness and transparency but leave unclear who may halt or question an AI system once risks emerge. As agencies deploy AI tools, they face practical gaps: they need explicit authority to pause or override a system without lengthy internal processes, auditable decision trails that make accountability more than performative, and procurement treated as a governance instrument so they retain control over how AI is deployed and used.
Read at Nextgov.com