
"Because AI is a subset of a very broad HPC (High Performance Computing) category of workloads. First, what is HPC? It's not an application; it's a loose term that covers apps and workflows many domains from financial services to pharma to manufacturing to a whole bunch of others. These are workloads that are demanding enough and important enough to justify the investment of time and money to run."
"Both AI (including ML, inference, and generative) and traditional HPC workflows are complex and computationally intensive. Garden variety systems and infrastructures simply can't handle the time to solution and accuracy of solution requirements posed by HPC or AI workflows. A high-performance infrastructure is critical to both types of workloads. While you could certainly run OpenFOAM or train an LLM on a laptop, the time to solution would be so long that the results wouldn't be relevant (or you'll be retired) when you finally get them."
AI is a subset of broader high-performance computing (HPC) workloads. HPC describes demanding applications and workflows across many domains such as financial services, pharma, and manufacturing that justify significant time and investment. Both AI (including ML, inference, and generative) and traditional HPC workflows are complex and computationally intensive, requiring high-performance infrastructure to meet time-to-solution and accuracy requirements. Running these workloads on low-power or consumer hardware yields impractically long runtimes. Both domains pursue ever greater scale and accuracy, such as evaluating more compounds in drug discovery, adding more training data, or expanding model parameters.
Read at The Register