
"In modern digital communication, we see the growing relevance of so-called AI companions and AI that look like and seem to behave like humans. We see AI assistants in messenger services, as well as AI agents that are an "autonomous" part of chatbots. We have conversational agents at every stage of education. There is also AI that appears as clones of real people, both living and dead people, and, of course, AI that is used for romantic partnerships. What all these applications have in common is that they want to make us believe we are having a human-like conversation."
"Talking with AI is no longer a niche topic. Character AI, a platform for creating AI companions, is the third most used AI platform after ChatGPT and the Chinese chatbot DeepSeek. Acknowledgment of AI companions is growing, and reluctance to establish deep connections with them seems to be disappearing. This is particularly apparent in Asian countries like Japan and China, where the latter has a very popular "emotional" chatbot called Xiaoice."
AI companions and human-like conversational agents now appear across messaging services, education, clones of real individuals, and romantic applications. These systems aim to simulate human conversation and encourage emotional engagement. Adoption of AI companions is rising worldwide, notably in parts of Asia where emotional chatbots are gaining popularity. Conversational agents provide educational support, daily assistance, and social connection for lonely users. However, human-like behavior can exploit expectations of reciprocity, promote anthropomorphization that undermines a critical understanding of the technology, and create significant data privacy and surveillance concerns.
Read at Apaonline