This AI combo could unlock human-level intelligence
Briefly

"When the Association for the Advancement of Artificial Intelligence (AAAI), based in Washington DC, asked its members earlier this year whether neural networks - the current star of artificial-intelligence systems - alone will be enough to hit this goal, the vast majority said no. Instead, most said, a heavy dose of an older kind of AI will be needed to get these systems up to par: symbolic AI."
"Sometimes called 'good old-fashioned AI', symbolic AI is based on formal rules and an encoding of the logical relationships between concepts. Mathematics is symbolic, for example, as are 'if-then' statements and computer coding languages such as Python, along with flow charts or Venn diagrams that map how, say, cats, mammals and animals are conceptually related. Decades ago, symbolic systems were an early front-runner in the AI effort."
"Now, however, the computer-science community is pushing hard for a better and bolder melding of the old and the new. 'Neurosymbolic AI' has become the hottest buzzword in town. Brandon Colelough, a computer scientist at the University of Maryland in College Park, has charted the meteoric rise of the concept in academic papers (see 'Going up and up'). These reveal a spike of interest in neurosymbolic AI that started in around 2021 and shows no sign of slowing down."
Neural networks have dominated recent AI progress, learning from vast datasets and powering large language models and chatbots. Symbolic AI, which encodes formal rules and explicit logical relationships between concepts, was an early front-runner before being outpaced by neural methods. Neurosymbolic approaches aim to combine flexible pattern learning with explicit reasoning to improve reliability, generalization, and robustness. The marked rise in academic attention since about 2021 reflects expectations that integrating symbolic methods with neural systems will yield smarter, more trustworthy AI and help advance toward artificial general intelligence.
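As a rough, purely hypothetical sketch of how the two paradigms might be wired together (the function names and threshold below are illustrative, not from the article), a learned perception step could hand its best guess to a symbolic rule base for further reasoning:

# Hypothetical neurosymbolic pipeline sketch: a stand-in "neural" perception step
# returns a label with a confidence score, and a symbolic rule base reasons over it.

RULES = {"cat": "mammal", "mammal": "animal"}  # explicit is_a relations

def neural_perception(image):
    # Placeholder for a trained neural network; here it just returns a fixed guess.
    return "cat", 0.93  # (label, confidence)

def symbolic_reasoning(label):
    # Follow the rule chain to derive every category the label belongs to.
    derived = []
    while label in RULES:
        label = RULES[label]
        derived.append(label)
    return derived

label, confidence = neural_perception(image=None)
if confidence > 0.8:  # only reason over confident neural outputs
    print(label, "is also:", symbolic_reasoning(label))  # cat is also: ['mammal', 'animal']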
Read at Nature