Buying a PC for local AI? These are the specs that matter

The kind of hardware you need to run AI models locally depends a lot on what exactly you're trying to achieve. As much as you might like to train a custom model at home, realistically, that's not going to happen. The types of generative AI workloads within reach of the average consumer usually fall into one of two categories: image generation and large language model (LLM)-based services like chatbots, summarization engines, or code assistants.
If you want to run a 70-billion-parameter model, you're going to need a rather beefy system, potentially with multiple high-end graphics cards. But a more modest eight-billion-parameter model is certainly something you can get running on a reasonably modern notebook or entry-level GPU. If you're interested in more advanced use cases, like fine-tuning or building applications that stitch together the capabilities of multiple models, you may need to start looking at workstation- or server-class hardware.
With our expectations set accordingly, let's dig into the specs that make the biggest difference when running generative AI models at home.

Memory / vRAM capacity

Without question, the most important stat to look for is memory capacity, or more specifically GPU vRAM, if you're looking for a system with dedicated graphics.
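To get a feel for why vRAM capacity dominates, you can estimate the memory footprint of a model's weights from its parameter count and numeric precision. The sketch below is a rough back-of-the-envelope helper, not a vendor formula; the 20 percent overhead factor for activations and the KV cache is an assumption, and real usage varies with context length and runtime.

```python
def vram_gb(params_billion: float, bytes_per_param: float, overhead: float = 1.2) -> float:
    """Rough VRAM estimate (in GB) for loading a model's weights.

    params_billion  -- parameter count in billions (e.g. 8 for an 8B model)
    bytes_per_param -- 2 for FP16, 1 for 8-bit, 0.5 for 4-bit quantization
    overhead        -- assumed multiplier (~20%) for activations/KV cache
    """
    return params_billion * bytes_per_param * overhead

# An 8B model quantized to 4 bits fits comfortably on an entry-level GPU:
print(vram_gb(8, 0.5))   # roughly 4.8 GB

# A 70B model at FP16 needs far more than any single consumer card offers:
print(vram_gb(70, 2.0))  # roughly 168 GB
```

This is why an eight-billion-parameter model runs fine on a modern notebook, while a 70B model pushes you toward multiple high-end GPUs or aggressive quantization.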
Read more at The Register