Reliable, fully local RAG agents with LLaMA3
LangChain
37.1K subscribers
57,264 views

Published Apr 19, 2024

With the release of LLaMA3, we're seeing great interest in agents that can run reliably and locally (e.g., on your laptop). Here, we show how to build reliable local agents from scratch using LangGraph and LLaMA3-8b. We combine ideas from three advanced RAG papers (Adaptive RAG, Corrective RAG, and Self-RAG) into a single control flow. We run this locally with a local vectorstore courtesy of @nomic_ai and @trychroma, @tavilyai for web search, and LLaMA3-8b via @ollama.

Code:
https://github.com/langchain-ai/langg...
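
For a sense of how the three papers combine into one graph, here is a minimal LangGraph sketch of the control flow described above. It is an illustration, not the code from the repo: the node names and placeholder node bodies are assumptions, and in the real version each node would call LLaMA3-8b (via Ollama), the Chroma vectorstore with Nomic embeddings, or Tavily web search.

```python
# Minimal sketch of the combined Adaptive / Corrective / Self-RAG control flow.
# Node bodies are placeholders standing in for the prompts and tools in the video.
from typing import List, TypedDict

from langgraph.graph import END, StateGraph


class GraphState(TypedDict):
    question: str
    documents: List[str]
    generation: str


def retrieve(state: GraphState) -> dict:
    # Query the local vectorstore (e.g., Chroma + Nomic embeddings) for the question.
    return {"documents": ["<retrieved chunks>"]}


def grade_documents(state: GraphState) -> dict:
    # Corrective RAG: the local LLM grades each chunk for relevance and drops noise.
    return {"documents": state["documents"]}


def web_search(state: GraphState) -> dict:
    # Fallback: web search (e.g., Tavily) when retrieval looks weak or off-topic.
    return {"documents": state["documents"] + ["<web results>"]}


def generate(state: GraphState) -> dict:
    # LLaMA3-8b (via Ollama) answers from the graded documents.
    return {"generation": "<answer>"}


def route_question(state: GraphState) -> str:
    # Adaptive RAG: route the question to the vectorstore or straight to web search.
    return "vectorstore"


def decide_to_generate(state: GraphState) -> str:
    # If no relevant documents survive grading, fall back to web search.
    return "generate" if state["documents"] else "web_search"


def grade_generation(state: GraphState) -> str:
    # Self-RAG: check the answer for grounding and usefulness.
    return "useful"


workflow = StateGraph(GraphState)
workflow.add_node("retrieve", retrieve)
workflow.add_node("grade_documents", grade_documents)
workflow.add_node("web_search", web_search)
workflow.add_node("generate", generate)

workflow.set_conditional_entry_point(
    route_question, {"vectorstore": "retrieve", "web_search": "web_search"}
)
workflow.add_edge("retrieve", "grade_documents")
workflow.add_conditional_edges(
    "grade_documents", decide_to_generate,
    {"generate": "generate", "web_search": "web_search"},
)
workflow.add_edge("web_search", "generate")
workflow.add_conditional_edges(
    "generate", grade_generation,
    {"useful": END, "not supported": "generate", "not useful": "web_search"},
)

app = workflow.compile()
print(app.invoke({"question": "What is agent memory?"}))
```

The point of the graph structure is that each paper's idea becomes either a node (retrieval grading, generation) or a conditional edge (question routing, hallucination/answer grading), so the whole pipeline stays inspectable and runs end to end on a laptop.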
