When questioned about its implications, the AI assistant from Boox, powered by ByteDance's Doubao, propagated specific narratives that align with the Chinese government's views.
Users highlighted discrepancies when the AI assistant refused to acknowledge events like the Tiananmen Square crackdown, illustrating a clear bias in the model's responses.
The incident raised concerns over the ethical implications of utilizing AI models designed for one region in products used in others, particularly in relation to misinformation.