Airline's Chatbot Lies About Bereavement Policy After Passenger's Grandmother Dies
Briefly

We kid you not, that's exactly what happened to a Canadian man named Jake Moffatt after his grandmother died: the airline's chatbot gave him inaccurate information about its bereavement fare policy, leaving him on the hook for hundreds of dollars he didn't anticipate.
Companies are rushing to deploy chatbots to handle customer interactions, even when they're not ready for prime time. The Air Canada chatbot is a particularly egregious example.
Read at Futurism