How to turn Ollama from a terminal tool into a browser-based AI with this free extension
Briefly

The article makes the case for running LLMs locally with Ollama instead of querying remote services, largely for privacy and control. It covers installing Ollama on the major operating systems and introduces Page Assist, a Firefox extension that gives the local LLM a browser-based interface. It then walks through opening and using the extension, and argues that running LLMs locally offers better control and convenience for AI-assisted research.
Using Ollama from within the terminal window is actually quite easy, but it doesn't give you obvious access to features such as model and prompt selection, image upload, toggling internet search, and settings.
To make this work, you'll need Ollama installed and running, as well as the Firefox browser. That's it. Let's make some magic.
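If you still need to install Ollama, the official one-line script handles Linux; macOS and Windows users can download the installer from ollama.com. As a quick sketch (the server listens on localhost port 11434 by default):

```bash
# Install Ollama on Linux (macOS/Windows: grab the installer from ollama.com)
curl -fsSL https://ollama.com/install.sh | sh

# Confirm the Ollama server is up; it listens on localhost:11434 by default
curl http://localhost:11434
# Expected response: "Ollama is running"
```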
If you've already installed it, you can run a local LLM with a command like the one below. If you see the >>> prompt, the LLM is running and ready to accept queries.
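For example (llama3.2 is just one model you could use; any model from the Ollama library works here):

```bash
# Pull the model on first run, then start an interactive session
ollama run llama3.2

# When the model is loaded, you'll land at the interactive prompt:
# >>>
```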
I opt for a local LLM, run with a tool such as Ollama, because sending my queries to a remote LLM is not appealing to me.
Read at ZDNET