
""For instance, if an AI system unfairly denies credit to a customer in urgent need - such as for medical treatment - there must be clarity on who is responsible: the developers, the institution deploying the model, or the data providers.""
""Over a year since the regime was established, it is not clear to us why HM Treasury has been so slow to use the new powers at its disposal. The Bank of England's Financial Policy Committee must monitor the regime's progres""
""individuals within financial services firms were "on the hook" for harm caused to consumers through AI.""
""the "lack of explainability" of AI models directly conflicted with the regime's requirement for senior managers to demonstrate they understood and controlled risks""
UK financial regulators are urged to conduct stress testing to ensure firms are ready for AI-driven market shocks. The Bank of England, Financial Conduct Authority and HM Treasury risk exposing consumers and the financial system to "potentially serious harm" by adopting a wait-and-see approach. Hearings revealed a lack of accountability and limited understanding of AI risks across financial services. Individuals within firms remain on the hook for consumer harm caused by AI, yet the lack of explainability of opaque models makes it difficult for senior managers to demonstrate they understand and control the risks. Clear accountability is required for harmful AI outcomes, and the Critical Third Parties regime needs urgent implementation by HM Treasury and monitoring by the Bank of England's Financial Policy Committee.
Read at The Register