A recent survey by the California Department of Technology found that state agencies reported no use of high-risk automated systems, even though algorithms help inform decisions in areas such as criminal justice and employment benefits. The survey follows legislation requiring annual reporting on systems that affect people's lives. Chief Technology Officer Jonathan Porat highlighted the difficulty of tracking such technology, since agencies self-report their own system usage. Current practices raise concerns about how algorithms are defined, about potential biases, and about the effectiveness of state oversight in ensuring fairness and transparency in decisions related to housing, education, and health care.
The California Department of Technology reported that state agencies do not use high-risk automated decision-making technology, despite evidence suggesting its widespread application in various services.
State Chief Technology Officer Jonathan Porat emphasized the challenges of tracking algorithm usage across agencies, noting that the department relies on agencies' self-reported data and their own interpretations of what constitutes 'high-risk'.
Legislation passed in 2023 mandates annual reports from state agencies on high-risk automated systems, but the results so far show a lack of transparency in how algorithms are deployed.
The complexity of automated systems raises important questions about how risk is defined and how such tools affect sectors like corrections, where they could lead to biased outcomes.
#california #automated-decision-making #algorithm-transparency #recidivism-scores #bias-in-technology