
"One day, after hearing one such cry, I tried to figure out how I would describe it to someone who never heard it. Imagine you have a piece of a bamboo trunk cut into two halves. The borders of the two halves have been covered with small indentations, creating a finely toothed ridge. Now imagine you slide one of those ridges against the other, very fast. The resulting sound would, I believe, resemble the cry of toucans."
"Since large language models (LLMs) were introduced to the public in 2023, claims about such models being conscious have proliferated. By "conscious," what is usually meant is the same as having a subjective experience-that it feels like something to be an LLM. In one study, more than half of participants who used LLMs attributed a non-zero chance of them being conscious."
The opening passage compares a toucan's cry to the sound of sliding two finely toothed halves of a bamboo trunk rapidly against each other, and raises the question of how one can know what an experience is like without ever having had it. Since 2023, claims that large language models (LLMs) are conscious have proliferated. Consciousness here is treated as having subjective experience: what it feels like to be an LLM. Surveys have found that more than half of LLM users assign a non-zero probability to the models being conscious. Engineers and researchers have sometimes endorsed such beliefs, as exemplified by Blake Lemoine's firing and Geoffrey Hinton's public statements. AI companies have leveraged consciousness attributions in marketing.
Read at Apaonline