
"So far, the accomplishments of these so-called AI scientists have been mixed. On the one hand, AI systems can process vast datasets and detect subtle correlations that humans are unable to detect. On the other hand, their lack of commonsense reasoning can result in unrealistic or irrelevant experimental recommendations. While AI can assist in tasks that are part of the scientific process, it is still far away from automating science-and may never be able to."
"AI models do not learn directly from the real world: They have to be "told" what the world is like by their human designers. Without human scientists overseeing the construction of the digital "world" in which the model operates-that is, the datasets used for training and testing its algorithms-the breakthroughs that AI facilitates wouldn't be possible. Consider the AI model AlphaFold. Its developers were awarded the 2024 Nobel Prize in chemistry for the model's ability to infer the structure of proteins in human cells."
AI systems can process vast datasets and detect subtle correlations beyond human capacity, enabling faster hypothesis generation and simulation. At the same time, AI lacks commonsense reasoning, which can produce unrealistic or irrelevant experimental recommendations. AI models learn only from human-designed datasets and algorithms, so human scientists must construct, curate, and oversee the digital training environment. High-profile successes, such as AlphaFold's protein-structure inference, depended on human expertise and data curation. Government initiatives like the Genesis Mission aim to train AI on federal scientific datasets to automate workflows and accelerate breakthroughs. Core scientific tasks, including conceptual understanding, experiment design, interpretation, and methodological judgment, remain primarily human responsibilities.
Read at Fast Company