
"OpenAI recently launched two open-weight models, branded gpt-oss, that users can download and run locally; gpt-oss-20b is the lighter of the two at 21 billion parameters."
"gpt-oss-20b needs around 16GB of free memory to run, while the heavier gpt-oss-120b needs 80GB."
"For users without high-end AI hardware, gpt-oss-20b is the more accessible option: it requires a GPU with at least 16GB of VRAM, or 24GB of system memory."
"To run gpt-oss-20b on Windows, download the Ollama app, which handles setup and lets you interact with the model via prompts."
OpenAI has launched two open-weight models, gpt-oss-20b and gpt-oss-120b, that can be downloaded and run locally. The lighter gpt-oss-20b has 21 billion parameters and needs roughly 16GB of free memory; the heavier gpt-oss-120b requires 80GB. That puts gpt-oss-20b within reach of many users without dedicated AI servers: a GPU with at least 16GB of VRAM, or 24GB of system memory, is enough. On Windows, the simplest route is the Ollama app, which handles downloading the model and lets users interact with it via prompts.
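The Ollama workflow described above comes down to a couple of terminal commands once the app is installed. A minimal sketch, assuming `gpt-oss:20b` is the model tag your Ollama version uses for this model (check `ollama list` or the Ollama model library if it differs):

```shell
# Download the gpt-oss-20b model weights from the Ollama registry
ollama pull gpt-oss:20b

# Start an interactive chat session with the model
ollama run gpt-oss:20b

# Or send a single prompt non-interactively and print the response
ollama run gpt-oss:20b "Explain the difference between VRAM and system RAM."
```

The same commands work in PowerShell or Command Prompt on Windows after installing the Ollama app; the first `pull`/`run` triggers the multi-gigabyte model download, so expect it to take a while.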
Read at The Register