Medieval theology has an old take on a new problem: AI responsibility
Briefly

A self-driving taxi designed to reduce congestion and air pollution raises moral dilemmas about who is accountable when accidents happen.
Developers of AI systems cannot fully predict an autonomous vehicle's actions, and people expect AI to behave in novel ways, which complicates assigning blame when accidents occur.
At the same time, self-driving taxis are deterministic, driven by code and sensor inputs, so they lack free will and are difficult to hold morally responsible for their actions.
The result is a 'responsibility gap': traditional moral frameworks do not suffice to capture who, if anyone, is responsible for the actions of autonomous AI systems.
Read at The Conversation