
"On those rare occasions when I use AI, I always opt for a local version. Most often, that comes in the form of Ollama installed on a desktop or laptop. I've been leery of using cloud-based AI for some time now for several reasons: It consumes vast amounts of energy. There's no way to be certain it honors privacy claims. I don't want any of my queries or data to be used for training LLMs."
"Please note that local AI on Puma Browser is still in the experimental phase, so it may encounter issues. As well, downloading an LLM will take some time. You should also ensure that you are connected to a wireless network; otherwise, the download will consume your data plan and take a long time. Even on wireless, the download of Qwen 3 1.5b took over 10 minutes."
Puma Browser is a free AI-centric mobile browser for Android that supports on-device local AI, letting users download and run various LLMs directly on a phone. Supported models include Qwen 3 1.5B and 4B, LFM2 1.2B and 700M, and Google Gemma 3n E2B. Model downloads can be large and slow (Qwen 3 1.5B took over 10 minutes on Wi‑Fi), and local LLMs can strain system resources and consume significant storage. Local AI support in Puma Browser is still experimental and may present issues, so users are advised to download models over Wi‑Fi to avoid mobile data overages.
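The quote above mentions running models locally through Ollama rather than a cloud service. As a minimal sketch of what a local-AI client looks like in practice, the Python helper below builds and sends a request to Ollama's default `/api/generate` endpoint on localhost; the model name used in the comment is an assumption for illustration, and Puma Browser's own on-device runtime is not described by the source.

```python
import json
import urllib.request

# Ollama's default local endpoint; nothing leaves the machine,
# which is the point of local AI versus a cloud service.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for a locally running Ollama server."""
    payload = json.dumps({
        "model": model,    # e.g. a small local model like "qwen3:1.7b" (name is an assumption)
        "prompt": prompt,
        "stream": False,   # return one JSON object instead of a token stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def ask_local_llm(model: str, prompt: str, timeout: float = 120.0) -> str:
    """Send the prompt to the local Ollama server and return the response text."""
    req = build_generate_request(model, prompt)
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read())["response"]
```

Because the endpoint is `localhost`, queries and data stay on the device, which addresses the privacy and training-data concerns raised in the quote.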
Read at ZDNET