
"On a personal basis, that means people using AI services want to be able to veto big decisions such as making payments, accessing or using contact details, changing account details, placing orders, or even just seeking clarity during a decision-making process. Extend this way of thinking to the working environment and the resistance is likely to be equally strong in professional settings."
"None of this should be seen as new; these demands have been clear since before OpenAI's ChatGPT appeared in late 2022. With recognition that AI can make decisions based on hallucinations, it seems more important than ever to preserve a role for human agency, as Apple's research shows. Interestingly enough, Google CEO Sundar Pichai sees it the same way, arguing, "The future of AI is not about replacing humans, it's about augmenting human capabilities.""
"Apple's study suggests that while people can get used to using artificial intelligence to get things done, they don't want to do so at the expense of agency. A KPMG study last year confirmed the extent to which people now use the tech, with 38% of respondents saying they use AI on a weekly or daily basis. That same study also showed 54% of people are wary when it comes to trusting the systems they use - and indicated that trust has declined over time."
People want the ability to veto major AI decisions such as making payments, accessing contacts, changing account details, placing orders, or requesting clarity during decision-making, and equivalent resistance exists in professional settings where human oversight of AI-driven choices is expected. These demands for preserved human control predate late 2022, and the recognition that AI can hallucinate makes human agency more important, not less. Apple's research shows that people will adopt AI to get things done but resist losing control, executives such as Pichai frame AI as augmenting rather than replacing humans, and surveys report widespread AI use alongside persistent and growing wariness about trusting these systems.
Read at Computerworld