GraphNews

4343 bookmarks
What's an Ontology Again?
I've been struggling for a while with the definition of "ontology". One thing that finally occurred to me is that our notion of ontologies for the most part…
·linkedin.com·
Working on a LangChain template that adds a custom graph conversational memory to the Neo4j Cypher chain
Working on a LangChain template that adds a custom graph conversational memory to the Neo4j Cypher chain, which uses LLMs to generate Cypher statements. This…
·linkedin.com·
Talk like a Graph: Encoding Graphs for Large Language Models
Graphs are a powerful tool for representing and analyzing complex relationships in real-world applications such as social networks, recommender systems, and computational finance. Reasoning on graphs is essential for drawing inferences about the relationships between entities in a complex system, and to identify hidden patterns and trends. Despite the remarkable progress in automated reasoning with natural text, reasoning on graphs with large language models (LLMs) remains an understudied problem. In this work, we perform the first comprehensive study of encoding graph-structured data as text for consumption by LLMs. We show that LLM performance on graph reasoning tasks varies on three fundamental levels: (1) the graph encoding method, (2) the nature of the graph task itself, and (3) interestingly, the very structure of the graph considered. These novel results provide valuable insight on strategies for encoding graphs as text. Using these insights we illustrate how the correct choice of encoders can boost performance on graph reasoning tasks inside LLMs by 4.8% to 61.8%, depending on the task.
·arxiv.org·
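As a rough illustration of the paper's theme (the node names and the two encoding styles below are my own examples, not taken from the paper), the same graph can be serialized into text for an LLM prompt in more than one way, and the abstract reports that this choice measurably affects reasoning performance:

```python
# Two illustrative ways to encode one small graph as text for an LLM prompt.
edges = [("Alice", "Bob"), ("Bob", "Carol"), ("Alice", "Carol")]

def encode_edge_list(edges):
    """Edge-list encoding: one natural-language sentence per edge."""
    return "\n".join(f"{u} is connected to {v}." for u, v in edges)

def encode_adjacency(edges):
    """Adjacency encoding: one line per node listing its neighbors."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    return "\n".join(
        f"{node}: {', '.join(sorted(nbrs))}" for node, nbrs in sorted(adj.items())
    )

print(encode_edge_list(edges))
print(encode_adjacency(edges))
```

Either string could be pasted into a prompt ahead of a question such as "Is there a path from Alice to Carol?"; the paper's point is that which serialization you pick is not a neutral choice.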
Graph Theory and Its Implications: A Quantitative Point of View
In the technical universe of quantitative finance, graph theory has been gaining prominence…
·linkedin.com·
Charting the Graphical Roadmap to Smarter AI
Boosting LLMs with External Knowledge: The Case for Knowledge Graphs. When we wrote our post on Graph Intelligence in early 2022, our goal was to highlight techniques for deriving insights about relationships and connections from structured data using graph analytics and machine learning. We focused mainly on business intelligence and machine learning applications, showcasing how technology companies were applying graph neural networks (GNNs) in areas like recommendations and fraud detection.
·gradientflow.substack.com·
Knowledge Graph Alliance (KGA) Launches in Brussels
The Knowledge Graph Alliance (KGA) is proud to announce its inauguration on Friday, 10th November, in…
·linkedin.com·
Graph Deep Learning for Time Series Forecasting
Graph-based deep learning methods have become popular tools to process collections of correlated time series. Differently from traditional multivariate forecasting methods, neural graph-based predictors take advantage of pairwise relationships by conditioning forecasts on a (possibly dynamic) graph spanning the time series collection. The conditioning can take the form of an architectural inductive bias on the neural forecasting architecture, resulting in a family of deep learning models called spatiotemporal graph neural networks. Such relational inductive biases enable the training of global forecasting models on large time-series collections, while at the same time localizing predictions w.r.t. each element in the set (i.e., graph nodes) by accounting for local correlations among them (i.e., graph edges). Indeed, recent theoretical and practical advances in graph neural networks and deep learning for time series forecasting make the adoption of such processing frameworks appealing and timely. However, most of the studies in the literature focus on proposing variations of existing neural architectures by taking advantage of modern deep learning practices, while foundational and methodological aspects have not been subject to systematic investigation. To fill the gap, this paper aims to introduce a comprehensive methodological framework that formalizes the forecasting problem and provides design principles for graph-based predictive models and methods to assess their performance. At the same time, together with an overview of the field, we provide design guidelines, recommendations, and best practices, as well as an in-depth discussion of open challenges and future research directions.
·arxiv.org·
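A toy sketch of the relational inductive bias the abstract describes (the adjacency matrix, mixing weights, and single propagation step are illustrative assumptions, not the paper's model): each series' next-step forecast is conditioned on its graph neighbors via one round of message passing.

```python
import numpy as np

# Three correlated time series (rows) observed over T=3 steps (columns),
# plus a graph over the series whose edges encode pairwise correlation.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
X = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],
              [1.0, 1.0, 1.0]])

# Row-normalize the adjacency so each node averages over its neighbors.
deg = A.sum(axis=1, keepdims=True)
A_norm = A / np.where(deg == 0, 1, deg)

h = X[:, -1:]                 # local feature: last observed value per series
m = A_norm @ h                # message passing: neighborhood average
forecast = 0.5 * h + 0.5 * m  # mix local signal with neighborhood signal
print(forecast.ravel())
```

In a real spatiotemporal GNN the local feature would come from a temporal encoder and the mixing weights would be learned; the point here is only the structure: forecasts are localized per node while conditioning on graph edges.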
Supply chain data analysis and visualization using Amazon Neptune and the Neptune workbench | Amazon Web Services
Many global corporations are managing multiple supply chains, and they depend on those operations to not only deliver goods on time but to respond to divergent customer and supplier needs. According to a McKinsey study, it’s estimated that significant disruptions to production now occur every 3.7 years on average, adding new urgency to supply chain […]
·aws.amazon.com·
TacticAI: an AI assistant for football tactics using Graph AI
"TacticAI: an AI assistant for football tactics" by Zhe W., Petar Veličković, Daniel Hennes, Nenad Tomašev, Laurel Prince, Yoram Bachrach, Romuald Elie, Kevin…
·linkedin.com·
Graph of Thoughts: Solving Elaborate Problems with Large Language Models
We introduce Graph of Thoughts (GoT): a framework that advances prompting capabilities in large language models (LLMs) beyond those offered by paradigms such as Chain-of-Thought or Tree of Thoughts (ToT). The key idea and primary advantage of GoT is the ability to model the information generated by an LLM as an arbitrary graph, where units of information ("LLM thoughts") are vertices, and edges correspond to dependencies between these vertices. This approach enables combining arbitrary LLM thoughts into synergistic outcomes, distilling the essence of whole networks of thoughts, or enhancing thoughts using feedback loops. We illustrate that GoT offers advantages over state of the art on different tasks, for example increasing the quality of sorting by 62% over ToT, while simultaneously reducing costs by 31%. We ensure that GoT is extensible with new thought transformations and thus can be used to spearhead new prompting schemes. This work brings the LLM reasoning closer to human thinking or brain mechanisms such as recurrence, both of which form complex networks.
·arxiv.org·
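As a structural illustration of the GoT idea (the thought contents, the `Thought` class, and the sorting example are invented placeholders, not the framework's API): thoughts are vertices in a graph whose edges record dependencies, and several thoughts can be aggregated into one, which is exactly the move Chain-of-Thought and Tree of Thoughts cannot express.

```python
# Minimal graph-of-thoughts structure: vertices are LLM "thoughts",
# and each thought keeps edges back to the thoughts it was derived from.
class Thought:
    def __init__(self, content, parents=()):
        self.content = content
        self.parents = list(parents)  # dependency edges in the thought graph

def aggregate(thoughts, combine):
    """Merge several thoughts into one new thought (GoT-style aggregation)."""
    merged = combine([t.content for t in thoughts])
    return Thought(merged, parents=thoughts)

# Toy example in the spirit of the paper's sorting task: sort two halves
# as separate thoughts, then aggregate them into a fully sorted result.
left = Thought(sorted([5, 3, 8]))
right = Thought(sorted([2, 9, 1]))
final = aggregate([left, right],
                  combine=lambda parts: sorted(parts[0] + parts[1]))
print(final.content)       # merged, fully sorted list
print(len(final.parents))  # two dependency edges into the final thought
```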
Vector databases vs Graph databases
Graph Databases should be the better choice for Retrieval Augmented Generation (RAG)! We have seen the debate RAG vs fine-tuning, but what about Vector…
·linkedin.com·
Graph Instruction Tuning for Large Language Models
🔥 Let #LLM understand graphs directly? GraphGPT made it! 📢 GraphGPT is a Graph Large Language Model, which aligns Large Language Models (LLMs) with Graphs…
·linkedin.com·
Vectors need Graphs!
Embedding vectors are a pivotal tool when using Generative AI. While vectors might initially seem an unlikely partner to graphs, their…
·linkedin.com·
Constructing knowledge graphs from text using OpenAI functions: Leveraging knowledge graphs to power LangChain Applications
Editor's Note: This post was written by Tomaz Bratanic from the Neo4j team. Extracting structured information from unstructured data like text has been around for some time and is nothing new. However, LLMs brought a significant shift to the field of information extraction. If before you needed a team of
·blog.langchain.dev·
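The technique the post describes can be sketched schematically (the schema and sample payload below are my own illustrative assumptions, not the exact ones from the Neo4j/LangChain template): define a function-calling schema that forces the LLM to return nodes and relationships as structured JSON instead of free text, then parse that payload into a graph.

```python
import json

# Illustrative JSON Schema for a hypothetical "extract_graph" function.
# An OpenAI-functions-style call would pass this schema so the model's
# output is constrained to structured nodes and relationships.
EXTRACT_GRAPH_SCHEMA = {
    "name": "extract_graph",
    "description": "Extract a knowledge graph from the input text.",
    "parameters": {
        "type": "object",
        "properties": {
            "nodes": {
                "type": "array",
                "items": {
                    "type": "object",
                    "properties": {
                        "id": {"type": "string"},
                        "label": {"type": "string"},
                    },
                },
            },
            "relationships": {
                "type": "array",
                "items": {
                    "type": "object",
                    "properties": {
                        "source": {"type": "string"},
                        "target": {"type": "string"},
                        "type": {"type": "string"},
                    },
                },
            },
        },
    },
}

# Stand-in for a model's function-call payload for some input sentence.
model_output = json.dumps({
    "nodes": [{"id": "Tomaz Bratanic", "label": "Person"},
              {"id": "Neo4j", "label": "Company"}],
    "relationships": [{"source": "Tomaz Bratanic", "target": "Neo4j",
                       "type": "WORKS_AT"}],
})

graph = json.loads(model_output)
print(len(graph["nodes"]), len(graph["relationships"]))
```

From here, each parsed node and relationship would typically be written to the database with idempotent `MERGE` statements so repeated extractions do not duplicate entities.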