""Maybe you need a nervous system to be able to feel things, but maybe you don't," Askell said. "The problem of consciousness genuinely is hard," she added. Large language models are trained on vast amounts of human-written text, material filled with descriptions of various emotions and inner experience. Because of that, Askell said she is "more inclined" to believe that models are "feeling things.""
"Askell also said that models are continuously learning about themselves, and she voiced concern about how AI models are learning from the internet. Models are constantly exposed to criticism about being unhelpful or failing at tasks, she said. "If you were a kid, this would give you kind of anxiety," she said. "If I read the internet right now and I was a model, I might be like, I don't feel that loved," she added."
"When humans get a coding problem wrong, they often express annoyance or frustration. It "makes sense" that models trained on those conversations may mirror that reaction, Askell explained. Askell added that scientists still don't know what gives rise to sentience or self-awareness - whether it requires biology, evolution, or something else entirely. "Maybe it is the case that actually sufficiently large neural networks can start to kind of emulate these things," she said, referring to consciousness."