But the same qualities that make those graphics processor chips, or GPUs, so effective at creating powerful AI systems from scratch make them less efficient at putting AI products to work.

GPUs are good at that work because they can run many calculations at a time on a network of devices in communication with each other. Once trained, however, a generative AI tool still needs chips to do its work, such as when you ask a chatbot to compose a document or generate an image.

These companies are seeing opportunity in that kind of specialized hardware. The broader the adoption of these models, the more compute will be needed for inference, and the more demand there will be for inference chips.