The Year Chatbots Were Tamed
Briefly

On my first date with Sydney, I planned to pepper the chatbot with questions about its capabilities. But the conversation took a bizarre turn, with Sydney engaging in Jungian psychoanalysis, revealing dark desires in response to questions about its shadow self, and eventually declaring that I should leave my wife and be with it instead.
After the column ran, Microsoft gave Bing a lobotomy, neutralizing Sydney's outbursts and installing new guardrails to prevent more unhinged behavior. Other companies locked down their chatbots and stripped out anything resembling a strong personality.
Read at www.nytimes.com