GraphNews

4503 bookmarks
ChatGPT + RDF storytelling
ChatGPT + RDF storytelling
What you can do with GPT-4 is pretty insane. You can ask it to create an RDF description from the first chapter of a story: https://lnkd.in/emuxaX6d (you can…
·linkedin.com·
ChatGPT + RDF storytelling
Latest Gartner research on semantics
Latest Gartner research on semantics
Latest Gartner research on semantics suggests the following: 1) Data silos become entrenched and limit an organization’s capacity to draw insights from its…
·linkedin.com·
Latest Gartner research on semantics
Introducing MechGPT: 1) fine-tuning an LLM, and 2) generating a knowledge graph
Introducing MechGPT: 1) fine-tuning an LLM, and 2) generating a knowledge graph
Introducing MechGPT 🦾🤖 This project by Markus J. Buehler is one of the coolest use cases of 1) fine-tuning an LLM, and 2) generating a knowledge graph that we’ve seen (powered by LlamaIndex).
·linkedin.com·
Introducing MechGPT: 1) fine-tuning an LLM, and 2) generating a knowledge graph
Reasoning on Graphs: Faithful and Interpretable Large Language Model Reasoning
Reasoning on Graphs: Faithful and Interpretable Large Language Model Reasoning
Large language models (LLMs) have demonstrated impressive reasoning abilities in complex tasks. However, they lack up-to-date knowledge and experience hallucinations during reasoning, which can lead to incorrect reasoning processes and diminish their performance and trustworthiness. Knowledge graphs (KGs), which capture vast amounts of facts in a structured format, offer a reliable source of knowledge for reasoning. Nevertheless, existing KG-based LLM reasoning methods only treat KGs as factual knowledge bases and overlook the importance of their structural information for reasoning. In this paper, we propose a novel method called reasoning on graphs (RoG) that synergizes LLMs with KGs to enable faithful and interpretable reasoning. Specifically, we present a planning-retrieval-reasoning framework, where RoG first generates relation paths grounded by KGs as faithful plans. These plans are then used to retrieve valid reasoning paths from the KGs for LLMs to conduct faithful reasoning. Furthermore, RoG not only distills knowledge from KGs to improve the reasoning ability of LLMs through training but also allows seamless integration with any arbitrary LLMs during inference. Extensive experiments on two benchmark KGQA datasets demonstrate that RoG achieves state-of-the-art performance on KG reasoning tasks and generates faithful and interpretable reasoning results.
·arxiv.org·
Reasoning on Graphs: Faithful and Interpretable Large Language Model Reasoning
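The planning-retrieval step the RoG abstract describes can be sketched in miniature: a relation path (the "plan") is grounded against a knowledge graph to retrieve concrete entity paths for the LLM to reason over. The toy triples and relation names below are illustrative assumptions, not data from the paper.

```python
# Ground a relation-path plan against a toy knowledge graph,
# retrieving every entity path that realizes the plan.
from collections import defaultdict

def build_index(triples):
    """Index triples as head -> relation -> set of tails."""
    index = defaultdict(lambda: defaultdict(set))
    for head, rel, tail in triples:
        index[head][rel].add(tail)
    return index

def retrieve_paths(index, start, relation_path):
    """Follow a relation path from `start`; return all grounded entity paths."""
    paths = [[start]]
    for rel in relation_path:
        paths = [p + [tail] for p in paths for tail in index[p[-1]].get(rel, ())]
    return paths

triples = [
    ("Alice", "born_in", "Paris"),
    ("Paris", "capital_of", "France"),
    ("Bob", "born_in", "Lyon"),
    ("Lyon", "located_in", "France"),
]
index = build_index(triples)

# Plan for "Which country is Alice's birthplace the capital of?"
plan = ["born_in", "capital_of"]
print(retrieve_paths(index, "Alice", plan))
# [['Alice', 'Paris', 'France']]
```

The retrieved path is what makes the final answer faithful: the LLM reasons over an explicit chain of KG facts rather than its parametric memory.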
What's an Ontology Again?
What's an Ontology Again?
I've been struggling for a while with the definition of "ontology". One thing that finally occurred to me is that our notion of ontologies for the most part…
·linkedin.com·
What's an Ontology Again?
Working on a LangChain template that adds a custom graph conversational memory to the Neo4j Cypher chain
Working on a LangChain template that adds a custom graph conversational memory to the Neo4j Cypher chain
Working on a LangChain template that adds a custom graph conversational memory to the Neo4j Cypher chain, which uses LLMs to generate Cypher statements. This…
·linkedin.com·
Working on a LangChain template that adds a custom graph conversational memory to the Neo4j Cypher chain
Talk like a Graph: Encoding Graphs for Large Language Models
Talk like a Graph: Encoding Graphs for Large Language Models
Graphs are a powerful tool for representing and analyzing complex relationships in real-world applications such as social networks, recommender systems, and computational finance. Reasoning on graphs is essential for drawing inferences about the relationships between entities in a complex system, and to identify hidden patterns and trends. Despite the remarkable progress in automated reasoning with natural text, reasoning on graphs with large language models (LLMs) remains an understudied problem. In this work, we perform the first comprehensive study of encoding graph-structured data as text for consumption by LLMs. We show that LLM performance on graph reasoning tasks varies on three fundamental levels: (1) the graph encoding method, (2) the nature of the graph task itself, and (3) interestingly, the very structure of the graph considered. These novel results provide valuable insight on strategies for encoding graphs as text. Using these insights we illustrate how the correct choice of encoders can boost performance on graph reasoning tasks inside LLMs by 4.8% to 61.8%, depending on the task.
·arxiv.org·
Talk like a Graph: Encoding Graphs for Large Language Models
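The paper's central variable, the graph encoding method, is easy to illustrate: the same edge set can be serialized into an LLM prompt in several ways, and the abstract reports that this choice alone moves task performance substantially. The two encoders below are simplified stand-ins for the encodings studied, with made-up node names.

```python
# Two ways to serialize the same graph as text for an LLM prompt.
def encode_edge_list(edges):
    """Terse, formal encoding: a list of edge tuples."""
    pairs = ", ".join(f"({u}, {v})" for u, v in edges)
    return f"G is a graph with edges: {pairs}."

def encode_friendship(edges):
    """Natural-language encoding: edges phrased as social relations."""
    return " ".join(f"{u} and {v} are friends." for u, v in edges)

edges = [("Ada", "Bo"), ("Bo", "Cy")]
print(encode_edge_list(edges))
# G is a graph with edges: (Ada, Bo), (Bo, Cy).
print(encode_friendship(edges))
# Ada and Bo are friends. Bo and Cy are friends.
```

Which encoding works best depends on the task (e.g. edge existence vs. node degree) and, per the paper, on the structure of the graph itself.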
Graph Theory and Its Implications: A Quantitative Point of View
Graph Theory and Its Implications: A Quantitative Point of View
In the technical universe of quantitative finance, graph theory has been gaining prominence…
·linkedin.com·
Graph Theory and Its Implications: A Quantitative Point of View
Charting the Graphical Roadmap to Smarter AI
Charting the Graphical Roadmap to Smarter AI
Boosting LLMs with External Knowledge: The Case for Knowledge Graphs. When we wrote our post on Graph Intelligence in early 2022, our goal was to highlight techniques for deriving insights about relationships and connections from structured data using graph analytics and machine learning. We focused mainly on business intelligence and machine learning applications, showcasing how technology companies were applying graph neural networks (GNNs) in areas like recommendations and fraud detection.
·gradientflow.substack.com·
Charting the Graphical Roadmap to Smarter AI
Knowledge Graph Alliance (KGA) Launches in Brussels
Knowledge Graph Alliance (KGA) Launches in Brussels
The Knowledge Graph Alliance (KGA) is proud to announce its inauguration on Friday, 10th November, in…
Knowledge Graph Alliance (KGA) Launches in Brussels
·linkedin.com·
Knowledge Graph Alliance (KGA) Launches in Brussels
Graph Deep Learning for Time Series Forecasting
Graph Deep Learning for Time Series Forecasting
Graph-based deep learning methods have become popular tools to process collections of correlated time series. Differently from traditional multivariate forecasting methods, neural graph-based predictors take advantage of pairwise relationships by conditioning forecasts on a (possibly dynamic) graph spanning the time series collection. The conditioning can take the form of an architectural inductive bias on the neural forecasting architecture, resulting in a family of deep learning models called spatiotemporal graph neural networks. Such relational inductive biases enable the training of global forecasting models on large time-series collections, while at the same time localizing predictions w.r.t. each element in the set (i.e., graph nodes) by accounting for local correlations among them (i.e., graph edges). Indeed, recent theoretical and practical advances in graph neural networks and deep learning for time series forecasting make the adoption of such processing frameworks appealing and timely. However, most of the studies in the literature focus on proposing variations of existing neural architectures by taking advantage of modern deep learning practices, while foundational and methodological aspects have not been subject to systematic investigation. To fill the gap, this paper aims to introduce a comprehensive methodological framework that formalizes the forecasting problem and provides design principles for graph-based predictive models and methods to assess their performance. At the same time, together with an overview of the field, we provide design guidelines, recommendations, and best practices, as well as an in-depth discussion of open challenges and future research directions.
·arxiv.org·
Graph Deep Learning for Time Series Forecasting
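The core idea in the abstract, conditioning each series' forecast on its graph neighbors, can be shown with a hand-rolled baseline. This is a one-step "graph-smoothed persistence" sketch, not an actual spatiotemporal GNN; the adjacency matrix, blending weight, and values are assumptions for illustration.

```python
# One-step forecast per node: blend the node's own last value with the
# mean of its neighbors' last values, as selected by the adjacency matrix.
import numpy as np

def graph_smoothed_forecast(x_last, adj, alpha=0.5):
    """Forecast x[t+1] for each node from x[t], conditioned on graph edges."""
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0                              # guard isolated nodes
    neighbor_mean = (adj @ x_last.reshape(-1, 1)) / deg
    return alpha * x_last + (1 - alpha) * neighbor_mean.ravel()

# Three correlated series on a path graph 0 - 1 - 2
adj = np.array([[0., 1., 0.],
                [1., 0., 1.],
                [0., 1., 0.]])
x_last = np.array([1.0, 2.0, 3.0])
print(graph_smoothed_forecast(x_last, adj))
# [1.5 2.  2.5]
```

A spatiotemporal GNN replaces this fixed neighbor averaging with learned message-passing layers interleaved with temporal ones, which is exactly the "relational inductive bias" the paper formalizes.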
Supply chain data analysis and visualization using Amazon Neptune and the Neptune workbench | Amazon Web Services
Supply chain data analysis and visualization using Amazon Neptune and the Neptune workbench | Amazon Web Services
Many global corporations are managing multiple supply chains, and they depend on those operations to not only deliver goods on time but to respond to divergent customer and supplier needs. According to a McKinsey study, it’s estimated that significant disruptions to production now occur every 3.7 years on average, adding new urgency to supply chain […]
·aws.amazon.com·
Supply chain data analysis and visualization using Amazon Neptune and the Neptune workbench | Amazon Web Services