  • I think using LLMs with RAG (i.e., giving the model retrieved context via tools) is more useful and reliable than relying only on the training data the model does its best to reproduce.

    For example, you can use a search engine to find results for a query, download the first 10 results as text, and then have the LLM answer follow-up questions about those sources. Another example is uploading a document and having the LLM answer questions about its contents. (A rough sketch of the first flow is at the end of this comment.)

    This is also advantageous because much smaller and faster models can be used while still producing accurate results, often with citations to the sources.

    This can even be self-hosted with Open WebUI and Ollama.
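
    A minimal sketch of the search-results flow, assuming a local Ollama server on its default port with a small model already pulled; the URLs, the model name "llama3.2", and the crude tag-stripping are placeholders for illustration, not anything specific the setup above requires:

    ```python
    # Sketch: answer a question using downloaded web pages as context,
    # via Ollama's local chat API. Assumes Ollama is running on :11434.
    import re
    import requests

    SOURCE_URLS = [
        "https://example.com/result-1",  # placeholder search results
        "https://example.com/result-2",
    ]

    def fetch_text(url: str) -> str:
        """Download a page and crudely strip HTML tags to plain text."""
        html = requests.get(url, timeout=10).text
        return re.sub(r"<[^>]+>", " ", html)

    # Build a numbered context block so the model can cite sources as [1], [2].
    context = "\n\n".join(
        f"[{i + 1}] {url}\n{fetch_text(url)[:4000]}"
        for i, url in enumerate(SOURCE_URLS)
    )

    question = "What do these sources say about the topic?"  # follow-up query

    resp = requests.post(
        "http://localhost:11434/api/chat",  # Ollama's default chat endpoint
        json={
            "model": "llama3.2",            # any small local model works here
            "stream": False,
            "messages": [
                {"role": "system",
                 "content": "Answer using only the numbered sources below "
                            "and cite them like [1], [2].\n\n" + context},
                {"role": "user", "content": question},
            ],
        },
        timeout=120,
    )
    print(resp.json()["message"]["content"])
    ```

    Tools like Open WebUI wrap roughly this pattern behind a UI, handling the retrieval, chunking, and prompt assembly for you.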