Nvidia's new GPU definition could mean pricier AI software
Briefly

At its recent GPU Technology Conference, Nvidia changed its definition of a GPU, now counting each GPU die as an individual GPU rather than each SXM module. The shift simplifies naming conventions, but it also significantly affects pricing for Nvidia's AI Enterprise licenses, potentially doubling costs. Because the new HGX B300 counts 16 GPUs, an AI Enterprise license for it can now be priced at $72,000 annually, despite only modest performance improvements over the previous B200. The move raises costs for customers who depend on Nvidia's AI software.
At its GPU Technology Conference last month, Nvidia broke with convention by shifting its definition of what counts as a GPU.
But Nvidia's shift to counting GPU dies, rather than SXM modules, as individual GPUs doesn't just simplify NVLink model numbers and naming conventions; it also changes what its software licenses cost.
Nvidia's AI Enterprise suite, which covers a host of AI frameworks, including access to its inference microservices (NIMs), runs $4,500 per GPU per year, or $1 per GPU per hour in the cloud.
With the new HGX B300 NVL16, Nvidia now counts each die as a GPU, bringing the system's GPU count to 16 and its annual AI Enterprise bill to $72,000.
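The licensing arithmetic can be sketched as follows. The $4,500 per-GPU annual price and the B300 NVL16's 16-GPU count come from the article; the assumption that the same board would have counted as 8 GPUs under the old module-based definition is for illustration only.

```python
# Sketch of how the per-die GPU definition changes AI Enterprise costs.
# $4,500 per GPU per year is the figure cited in the article.
LICENSE_PER_GPU_PER_YEAR = 4_500  # USD

def annual_license_cost(gpu_count: int) -> int:
    """Annual AI Enterprise license cost for a system with gpu_count GPUs."""
    return gpu_count * LICENSE_PER_GPU_PER_YEAR

# Old definition: one GPU per SXM module (assumed 8 modules per HGX board).
old_cost = annual_license_cost(8)

# New definition: one GPU per die (HGX B300 NVL16 counts as 16 GPUs).
new_cost = annual_license_cost(16)

print(old_cost, new_cost)  # 36000 72000 -- the license cost doubles
```

Under these assumptions, the same hardware goes from a $36,000 to a $72,000 annual license simply because the dies are now counted individually.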
Read at The Register