DeepMind's AI has found more new materials in a year than scientists have in centuries
Briefly

Google DeepMind researchers have trained a deep learning model to predict the structures of over 2.2 million new crystalline materials, 45 times more than the number discovered in the entire history of science.
Among the new materials, the AI identified 52,000 layered compounds similar to graphene that could be used to develop more efficient superconductors, which are crucial components in MRI scanners, experimental quantum computers, and nuclear fusion reactors. It also found 528 potential lithium-ion conductors, 25 times more than a previous study, which could help boost the performance of EV batteries.
To achieve these discoveries, the deep learning model was trained on extensive data from the Materials Project, a program led by the Lawrence Berkeley National Laboratory in the US that has used similar AI techniques to discover about 28,000 new stable materials over the past decade.
Read at TNW | Deep-Tech