Is Tree-based RAG Struggling? Not with Knowledge Graphs!
Diffbot

Published on Mar 25, 2024

Long-context models such as Google Gemini 1.5 Pro or the Large World Model are changing the way we think about RAG (retrieval-augmented generation), and some are starting to explore the potential of “long-context RAG”. One example is RAPTOR (Recursive Abstractive Processing for Tree-Organized Retrieval): by recursively clustering and summarizing documents, this method lets language models grasp both general concepts and granular details across the individual documents. Inspired by LangChain, we tested constructing a tree-based long-context RAG pipeline. Watch the video to find out whether this approach lets us say goodbye to the lost-in-the-middle effect commonly seen in large language models. AND, most importantly, how knowledge graphs can come to the rescue to enhance answer quality. 😉
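For the curious, the tree-building idea can be sketched in a few lines. This is a minimal illustration, not the code from the video: fixed-size grouping stands in for RAPTOR's embedding-based clustering, and `summarize()` is a stub standing in for an LLM summarization call.

```python
def summarize(texts):
    # Stub: a real RAPTOR pipeline would call an LLM to summarize the cluster.
    # Here we just keep the first sentence of each text, joined together.
    return " | ".join(t.split(".")[0] for t in texts)


def build_tree(chunks, group_size=2):
    """Recursively group and summarize chunks until one root summary remains.

    Returns every level of the tree; retrieval can then search across ALL
    levels at once, mixing granular leaf chunks with high-level summaries.
    """
    levels = [chunks]
    while len(levels[-1]) > 1:
        current = levels[-1]
        # Simplification: fixed-size groups instead of embedding clusters.
        groups = [current[i:i + group_size]
                  for i in range(0, len(current), group_size)]
        levels.append([summarize(g) for g in groups])
    return levels


tree = build_tree(["Doc A. details", "Doc B. details",
                   "Doc C. details", "Doc D. details"])
# 4 leaves -> 2 mid-level summaries -> 1 root summary
```

The key property this preserves from the paper: lower levels keep granular detail while upper levels capture general themes, so a query can match at whichever level fits its scope.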

0:00 Coming Up
0:30 Intro
1:41 What is RAPTOR?
2:35 Knowledge base for RAG
3:13 Constructing a Knowledge Graph with Diffbot API for RAG
3:59 Construct tree-based RAG with RAPTOR
4:22 Test questions on tree-based RAG!
6:45 Reference our knowledge graph for answer enhancement
7:11 Are answers enhanced by our knowledge graph?
8:25 What are your takeaways?

RAPTOR Paper: https://arxiv.org/abs/2401.18059v1
Code: https://github.com/leannchen86/raptor...

Want to turn your unstructured text data into knowledge graphs?
Sign up for a 2-week free trial at https://app.diffbot.com/get-started/ to extract entities and relationships from your custom text data. No credit card needed.
