
"In his work Reflections on Language, published in 1975, Noam Chomsky argued that children learn to speak not just by imitating what they hear, but by constructing a theory from sparse and disordered information. According to the linguist, a child infers knowledge that goes far beyond what they have heard, allowing them to produce new sentences with no direct connection to previous experiences. In other words, children do not merely reproduce patterns they create original knowledge."
"In their academic essay Theory Is All You Need: AI, Human Cognition, and Causal Reasoning, published in late 2024 in the journal Strategy Science, the authors describe AI language generation as backward-looking and imitative, whereas human cognition is forward-looking and capable of generating genuine novelty. There are researchers who have analyzed how babies process their environment. And it turns out that they not only absorb data, but they are constantly making conjectures or formulating hypotheses, explains co-author Felin, 52, via video call from Utah. If I drop my cup on the table, I learn something about the world around me. And it turns out that's precisely the crux of the matter: the ability to formulate conjectures, to want to experiment, or to formulate hypotheses."
"The researcher, who is also the founder of the Institute for Interdisciplinary Studies at Utah State University, states that one of his goals is to debunk all the hype surrounding AI and highlight how the human mind is unique in its causal and theoretical reasoning. The study emphasizes how the mind is not merely an information processor and that humans not only predict the world but also intervene in it and transform it. This, according to the authors, dismantles the mind-machine analogy."
Children acquire language by constructing theories from sparse, disordered input and infer knowledge that exceeds their direct experiences. Human cognition is forward-looking, formulating hypotheses, experimenting, and generating genuine novelty rather than merely reproducing patterns. AI language generation mainly imitates past data and predicts backward from observed examples. Infants constantly make conjectures about their environment, learning by intervening and testing causal relationships. The human mind engages in causal and theoretical reasoning, intervening in and transforming the world rather than acting only as an information processor. This theoretical, interventionist capacity differentiates human learning and reasoning from current AI systems.
Read at english.elpais.com