Quantum Machines and Nvidia use machine learning to get closer to an error-corrected quantum computer | TechCrunch
Briefly

At first glance, calibration may seem like a one-shot problem: You calibrate the processor before you start running the algorithm on it. But it's not that simple.
If we can frequently recalibrate it using these kinds of techniques and underlying hardware, then we can improve the performance and keep the fidelity [high] over a long time.
The holy grail, he said, is to run quantum error correction. We're not there yet. Instead, this collaboration focused on calibration.
Those compute engines were small and limited, but that's not a problem with Nvidia's extremely powerful DGX platform.
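The recalibration idea described above — measure, re-tune, repeat as the hardware drifts — can be sketched as a closed loop. The following is a minimal toy simulation under invented assumptions (a single drive-amplitude parameter, a fabricated 1-D fidelity model, hypothetical function names); it is not Quantum Machines' or Nvidia's actual software or API.

```python
import random

def measure_fidelity(amp, optimal_amp):
    """Toy stand-in for a gate-fidelity measurement: fidelity falls off
    linearly as the drive amplitude drifts away from its optimum."""
    return max(0.0, 1.0 - abs(amp - optimal_amp))

def recalibrate(optimal_amp, lo=0.0, hi=1.0, steps=20):
    """Toy 1-D calibration sweep: try a grid of candidate amplitudes and
    keep the one with the best measured fidelity."""
    candidates = [lo + (hi - lo) * i / (steps - 1) for i in range(steps)]
    return max(candidates, key=lambda a: measure_fidelity(a, optimal_amp))

random.seed(0)
amp = 0.5      # currently calibrated drive amplitude
optimal = 0.5  # the "true" optimum, which drifts between cycles

for cycle in range(5):
    optimal += random.uniform(-0.05, 0.05)  # simulated hardware drift
    fidelity_before = measure_fidelity(amp, optimal)
    amp = recalibrate(optimal)              # periodic recalibration step
    fidelity_after = measure_fidelity(amp, optimal)
    print(f"cycle {cycle}: {fidelity_before:.4f} -> {fidelity_after:.4f}")
```

Without the recalibration step, `fidelity_before` would decay as the drift accumulates; re-running the sweep each cycle keeps the measured fidelity pinned near its maximum, which is the point the quote makes about maintaining fidelity over a long time.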