#document-retrieval

From HackerNoon · 3 years ago

Build a Fully Local RAG System with rlama and Ollama - No Cloud, No Dependencies | HackerNoon

In RAG, a knowledge store is queried to retrieve pertinent documents, which are then added to the LLM prompt to help ground the model's output in factual data.
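A minimal sketch of that retrieve-then-prompt loop, assuming a local Ollama server at `http://localhost:11434` with a pulled model (the model name below is an assumption, and the toy keyword-overlap retriever stands in for rlama's actual knowledge store):

```python
# Sketch of a local RAG loop: retrieve pertinent documents, add them to the
# prompt, and generate with a locally served model via Ollama's HTTP API.
import requests

# A tiny in-memory "knowledge store" standing in for an indexed document set.
DOCS = [
    "rlama indexes local files into a knowledge store that can be queried.",
    "Ollama serves open-weight LLMs locally over an HTTP API.",
    "RAG grounds model output by adding retrieved documents to the prompt.",
]

def retrieve(query: str, k: int = 2) -> list[str]:
    """Score each document by word overlap with the query; return the top k."""
    q_words = set(query.lower().split())
    ranked = sorted(
        DOCS,
        key=lambda d: len(q_words & set(d.lower().split())),
        reverse=True,
    )
    return ranked[:k]

def answer(query: str, model: str = "llama3") -> str:
    """Build a grounded prompt from retrieved context and ask the local model."""
    context = "\n".join(retrieve(query))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    resp = requests.post(
        "http://localhost:11434/api/generate",  # Ollama generate endpoint
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=120,
    )
    return resp.json()["response"]

if __name__ == "__main__":
    print(answer("How does RAG keep the model's answers grounded?"))
```

In a real setup the keyword scorer would be replaced by vector search over embeddings of the indexed documents, but the flow is the same: query the store, inject the hits into the prompt, generate locally.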