Found 356 bookmarks
Language, Graphs, and AI in Industry
Here's the video for my talk @ K1st World Symposium 2023 about the intersections of KGs and LLMs: https://lnkd.in/gugB8Yjj and also the slides, plus related…
·linkedin.com·
Knowledge graph-based RAG (retrieval-augmentation) consistently improves language model accuracy, this time on biomedical questions
The evidence for the massive impact of KGs on NLQ keeps piling up. Here's one more paper showing that knowledge graph-based RAG (retrieval-augmentation)…
·linkedin.com·
RAG patterns with Knowledge Graphs and vector search
This week Alexander Erdl and I did our last #GoingMeta of the year, and we talked about Advanced #RAG patterns with #KnowledgeGraphs and vector search. We…
·linkedin.com·
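As a rough illustration of one pattern from the episode above (my sketch, not the episode's own code): use a Neo4j vector index to find entry-point chunks, then hop the graph to attach connected context to each hit. The `MENTIONS`/`Entity` schema and the `chunk_embeddings` index name below are assumptions, and exact kwargs vary by LangChain version.

```python
# A minimal sketch of one KG + vector RAG pattern: vector-search chunk
# nodes in Neo4j, then traverse the graph to enrich each retrieved chunk.
# Schema and index name are assumed, not taken from the episode.
from langchain_openai import OpenAIEmbeddings
from langchain_community.vectorstores import Neo4jVector

# Cypher appended after the vector match; must return text, score, metadata.
retrieval_query = """
MATCH (node)-[:MENTIONS]->(e:Entity)
RETURN node.text AS text, score, {entities: collect(e.name)} AS metadata
"""

store = Neo4jVector.from_existing_index(
    OpenAIEmbeddings(),
    url="bolt://localhost:7687",
    username="neo4j",
    password="password",
    index_name="chunk_embeddings",    # assumed vector index over chunk nodes
    retrieval_query=retrieval_query,  # graph hop enriches each vector hit
)

docs = store.similarity_search("How do knowledge graphs improve RAG?", k=3)
```

The point of the `retrieval_query` hook is that similarity search only picks the anchor nodes; the graph traversal decides what context actually reaches the LLM.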
Large Language Models on Graphs: A Comprehensive Survey
Large language models (LLMs), such as ChatGPT and LLaMA, are creating significant advancements in natural language processing, due to their strong text encoding/decoding ability and newly found emergent capability (e.g., reasoning). While LLMs are mainly designed to process pure texts, there are many real-world scenarios where text data are associated with rich structure information in the form of graphs (e.g., academic networks, and e-commerce networks) or scenarios where graph data are paired with rich textual information (e.g., molecules with descriptions). Besides, although LLMs have shown their pure text-based reasoning ability, it is underexplored whether such ability can be generalized to graph scenarios (i.e., graph-based reasoning). In this paper, we provide a systematic review of scenarios and techniques related to large language models on graphs. We first summarize potential scenarios of adopting LLMs on graphs into three categories, namely pure graphs, text-rich graphs, and text-paired graphs. We then discuss detailed techniques for utilizing LLMs on graphs, including LLM as Predictor, LLM as Encoder, and LLM as Aligner, and compare the advantages and disadvantages of different schools of models. Furthermore, we mention the real-world applications of such methods and summarize open-source codes and benchmark datasets. Finally, we conclude with potential future research directions in this fast-growing field. The related source can be found at https://github.com/PeterGriffinJin/Awesome-Language-Model-on-Graphs.
·arxiv.org·
Convert your text into an interactive Knowledge Graph
When reading lengthy or intricate texts, keeping an overview of different dependencies within the context is crucial. Traditionally, humans achieve this through note-taking or mentally creating a concept map. Now imagine having AI at hand which generates such a map for you. Even better, the…
·ai-readiness.ch·
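A tiny sketch of the idea behind the article above (my illustration, not the article's tool): prompt an LLM to extract (subject, predicate, object) triples from a passage, then render each triple as an edge in an interactive graph. The prompt wording and model choice are assumptions.

```python
# Illustrative sketch: LLM-based triple extraction for a text-to-KG demo.
# Prompt and model are assumptions, not the article's implementation.
import json
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def extract_triples(text: str) -> list[tuple[str, str, str]]:
    prompt = (
        "Extract knowledge-graph triples from the text below. "
        'Reply with JSON: {"triples": [[subject, predicate, object], ...]}\n\n'
        + text
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
        response_format={"type": "json_object"},
    )
    return [tuple(t) for t in json.loads(resp.choices[0].message.content)["triples"]]

# Each triple becomes an edge in the interactive graph view.
print(extract_triples("Marie Curie discovered polonium and won two Nobel Prizes."))
```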
Common sense knowledge graphs are slightly different from conventional knowledge graphs, but they share the most important thing: they both capture explicit symbolic knowledge
I really enjoyed the latest #UnconfuseMe with Bill Gates and Yejin Choi. Yejin's research is on symbolic knowledge distillation, which means they take large…
·linkedin.com·
Introducing MechGPT: 1) fine-tuning an LLM, and 2) generating a knowledge graph
Introducing MechGPT 🦾🤖 This project by Markus J. Buehler is one of the coolest use cases of 1) fine-tuning an LLM, and 2) generating a knowledge graph that we've seen (powered by LlamaIndex).
·linkedin.com·
Reasoning on Graphs: Faithful and Interpretable Large Language Model Reasoning
Large language models (LLMs) have demonstrated impressive reasoning abilities in complex tasks. However, they lack up-to-date knowledge and experience hallucinations during reasoning, which can lead to incorrect reasoning processes and diminish their performance and trustworthiness. Knowledge graphs (KGs), which capture vast amounts of facts in a structured format, offer a reliable source of knowledge for reasoning. Nevertheless, existing KG-based LLM reasoning methods only treat KGs as factual knowledge bases and overlook the importance of their structural information for reasoning. In this paper, we propose a novel method called reasoning on graphs (RoG) that synergizes LLMs with KGs to enable faithful and interpretable reasoning. Specifically, we present a planning-retrieval-reasoning framework, where RoG first generates relation paths grounded by KGs as faithful plans. These plans are then used to retrieve valid reasoning paths from the KGs for LLMs to conduct faithful reasoning. Furthermore, RoG not only distills knowledge from KGs to improve the reasoning ability of LLMs through training but also allows seamless integration with any arbitrary LLMs during inference. Extensive experiments on two benchmark KGQA datasets demonstrate that RoG achieves state-of-the-art performance on KG reasoning tasks and generates faithful and interpretable reasoning results.
·arxiv.org·
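To make the abstract's planning-retrieval-reasoning framework concrete, here is a toy sketch (my reading of the abstract, not the authors' code). The planning step, which RoG performs with a trained LLM, is hard-coded as a single relation path; retrieval then grounds that path in a dict-based KG, and the final reasoning step would hand the surviving paths to an LLM.

```python
# Toy sketch of RoG's planning-retrieval-reasoning loop (illustrative only).
# KG maps (head_entity, relation) -> list of tail entities.
KG = {
    ("Alice", "spouse"): ["Bob"],
    ("Bob", "place_of_birth"): ["Paris"],
}

def retrieve_paths(start: str, relation_path: list[str]) -> list[list[str]]:
    """Ground a relation path (the 'faithful plan') as concrete entity
    paths that actually exist in the KG."""
    frontier = [[start]]
    for rel in relation_path:
        frontier = [
            path + [tail]
            for path in frontier
            for tail in KG.get((path[-1], rel), [])
        ]
    return frontier

# Planning (generated by a fine-tuned LLM in the paper; hard-coded here):
plan = ["spouse", "place_of_birth"]    # "Where was Alice's spouse born?"
# Retrieval: only KG-verified paths survive, which is what keeps it faithful.
paths = retrieve_paths("Alice", plan)  # [['Alice', 'Bob', 'Paris']]
# Reasoning: an LLM would now answer from these paths and can cite them.
print(paths)
```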
Working on a LangChain template that adds a custom graph conversational memory to the Neo4j Cypher chain
Working on a LangChain template that adds a custom graph conversational memory to the Neo4j Cypher chain, which uses LLMs to generate Cypher statements. This…
·linkedin.com·
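A rough sketch of what such a template might look like (an assumption on my part; the post above doesn't show code): LangChain's GraphCypherQAChain handles the LLM-to-Cypher step, and each dialogue turn is written back to Neo4j as a small subgraph that serves as conversational memory. The Session/Turn schema below is hypothetical.

```python
# Hedged sketch: Neo4j Cypher QA chain plus a hand-rolled graph memory.
# The Session/Turn schema is hypothetical; chain kwargs vary by
# langchain version.
from langchain_community.graphs import Neo4jGraph
from langchain.chains import GraphCypherQAChain
from langchain_openai import ChatOpenAI

graph = Neo4jGraph(url="bolt://localhost:7687", username="neo4j", password="password")
chain = GraphCypherQAChain.from_llm(ChatOpenAI(temperature=0), graph=graph)

def ask(session_id: str, question: str) -> str:
    answer = chain.invoke({"query": question})["result"]  # LLM writes Cypher
    # Persist the turn as graph-shaped conversational memory.
    graph.query(
        """
        MERGE (s:Session {id: $sid})
        CREATE (s)-[:HAS_TURN]->(:Turn {question: $q, answer: $a, ts: timestamp()})
        """,
        {"sid": session_id, "q": question, "a": answer},
    )
    return answer
```

Storing memory as Session/Turn nodes, rather than a flat message buffer, is what makes it "graph" memory: past turns can be retrieved with the same Cypher machinery the chain already uses.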
Charting the Graphical Roadmap to Smarter AI
Boosting LLMs with External Knowledge: The Case for Knowledge Graphs. When we wrote our post on Graph Intelligence in early 2022, our goal was to highlight techniques for deriving insights about relationships and connections from structured data using graph analytics and machine learning. We focused mainly on business intelligence and machine learning applications, showcasing how technology companies were applying graph neural networks (GNNs) in areas like recommendations and fraud detection.
·gradientflow.substack.com·