This browser lets you use AI locally on your phone, even offline - here's how
Briefly

"Most often, that comes in the form of Ollama installed on a desktop or laptop. I have been leery of using cloud-based AI for some time now for several reasons: It consumes vast amounts of energy. There's no way to be certain it honors privacy claims. I don't want any of my queries or data being used to train LLMs."
"Given I've used local AI plenty of times (on various hardware), I know how it can be a drain on system resources, so I assumed local AI on a phone would be dreadfully slow. I also know how much storage those LLMs can take up, so I downloaded the LLM with a light bit of trepidation (unsure if uninstalling Puma Browser would also remove the LLM)."
"Keep in mind that local AI on Puma Browser is still in the experimental phase, so it's going to have issues. As well, downloading an LLM will take some time. You should also make sure that you are on a wireless network; otherwise, the download will gobble up your data plan and will take forever. Even on wireless, the download of Qwen 3 1.5b took over 10 minutes."
Puma Browser brings local AI to mobile devices by letting users download and run on-device LLMs such as Qwen 3 1.5b, Qwen 3 4B, LFM2 variants, and Google Gemma 3n E2B. Local AI is preferred for its lower energy impact, stronger privacy control, and the assurance that queries and data won't be used to train cloud models. Downloading a model takes time and substantial storage, and should be done over Wi‑Fi to avoid data overages. The Local AI feature is still experimental and may have issues, but initial testing on a Pixel 9 Pro showed unexpectedly good performance despite resource concerns.
Read at ZDNET