GraphNews

3484 bookmarks
GraphGPT
🌟GraphGPT🌟 (385 stars on GitHub) has been accepted at 🌟SIGIR'24🌟 (only a 20.1% acceptance rate)! Thanks to Yuhao Yang, Wei Wei, and the other co-authors for their precious…
·linkedin.com·
The Taxonomy Tortoise and the ML Hare
Here is my third blog entry of 2024. In this one, I consider the slow pace of taxonomy and ontology building vs. the rapid pace of machine learning models…
·linkedin.com·
Tree-based RAG with RAPTOR and how knowledge graphs can come to the rescue to enhance answer quality.
Long-context models, such as Google Gemini Pro 1.5 or the Large World Model, are probably changing the way we think about RAG (retrieval-augmented generation)…
·linkedin.com·
Why Graphs? Graph databases are the most powerful tools for managing data because of their structure and flexibility. Here are 6 reasons why
Why Graphs? Graph databases are the most powerful tools for managing data because of their structure and flexibility. Here are 6 reasons why: 1. They are…
·linkedin.com·
Jensen Huang in his keynote at NVIDIA GTC24 calls out three sources of data to integrate with LLMs: 1) vector databases, 2) ERP / CRM and 3) knowledge graphs
Wow, in his keynote at NVIDIA #GTC24, Jensen Huang (CEO) calls out three sources of data to integrate with LLMs: 1) vector databases, 2) ERP / CRM and 3)…
·linkedin.com·
Kurt Cagle chatbot on Knowledge Graphs, Ontology, GenAI and Data
I want to thank Jay (JieBing) Yu, PhD for his hard work in creating a Mini-Me (https://lnkd.in/g6TR543j), a virtual assistant built on his fantastic LLM work…
Kurt is one of my favorite writers, a seasoned practitioner and a deep thinker in the areas of Knowledge Graphs, Ontology, GenAI, and Data.
·linkedin.com·
Towards Graph Foundation Models for Personalization
In the realm of personalization, integrating diverse information sources such as consumption signals and content-based representations is becoming increasingly critical to build state-of-the-art solutions. In this regard, two of the biggest trends in research around this subject are Graph Neural Networks (GNNs) and Foundation Models (FMs). While GNNs emerged as a popular solution in industry for powering personalization at scale, FMs have only recently caught attention for their promising performance in personalization tasks like ranking and retrieval. In this paper, we present a graph-based foundation modeling approach tailored to personalization. Central to this approach is a Heterogeneous GNN (HGNN) designed to capture multi-hop content and consumption relationships across a range of recommendable item types. To ensure the generality required from a Foundation Model, we employ a Large Language Model (LLM) text-based featurization of nodes that accommodates all item types, and construct the graph using co-interaction signals, which inherently transcend content specificity. To facilitate practical generalization, we further couple the HGNN with an adaptation mechanism based on a two-tower (2T) architecture, which also operates agnostically to content type. This multi-stage approach ensures high scalability; while the HGNN produces general-purpose embeddings, the 2T component models the sheer volume of user-item interaction data in a continuous space. Our comprehensive approach has been rigorously tested and proven effective in delivering recommendations across a diverse array of products within a real-world, industrial audio streaming platform.
·arxiv.org·
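The multi-stage recipe in the abstract (frozen general-purpose GNN embeddings plus a lightweight two-tower adapter) can be sketched roughly as follows. The shapes, random weights, and cosine scoring here are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for general-purpose HGNN node embeddings, frozen upstream.
ITEM_EMB = rng.normal(size=(1000, 64))   # 1000 items, 64-dim
USER_EMB = rng.normal(size=(100, 64))    # 100 users, 64-dim

# Two-tower adaptation: each tower projects its input into a shared space.
W_user = rng.normal(size=(64, 32)) / np.sqrt(64)
W_item = rng.normal(size=(64, 32)) / np.sqrt(64)

def tower(x, w):
    h = x @ w
    return h / np.linalg.norm(h, axis=-1, keepdims=True)  # unit-normalize

def top_k(user_id, k=5):
    u = tower(USER_EMB[user_id], W_user)   # (32,)
    items = tower(ITEM_EMB, W_item)        # (1000, 32)
    scores = items @ u                     # cosine similarity per item
    return np.argsort(-scores)[:k]         # indices of the k best items

print(top_k(7))
```

In a trained system the tower weights would be learned from interaction data while the HGNN embeddings stay fixed, which is what makes the 2T stage cheap to retrain per surface.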
Exploring the Potential of Large Language Models in Graph Generation
Large language models (LLMs) have achieved great success in many fields, and recent works have explored LLMs for graph discriminative tasks such as node classification. However, the abilities of LLMs for graph generation remain unexplored in the literature. Graph generation requires the LLM to generate graphs with given properties, which has valuable real-world applications such as drug discovery, while tending to be more challenging. In this paper, we propose LLM4GraphGen to explore the ability of LLMs for graph generation with systematic task designs and extensive experiments. Specifically, we propose several tasks, paired with comprehensive experiments, to address key questions regarding LLMs' understanding of different graph structure rules, their ability to capture structural type distributions, and their utilization of domain knowledge for property-based graph generation. Our evaluations demonstrate that LLMs, particularly GPT-4, exhibit preliminary abilities in graph generation tasks, including rule-based and distribution-based generation. We also observe that popular prompting methods, such as few-shot and chain-of-thought prompting, do not consistently enhance performance. Besides, LLMs show potential in generating molecules with specific properties. These findings may serve as foundations for designing good LLM-based models for graph generation and provide valuable insights for further research.
·arxiv.org·
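Rule-based generation of the kind evaluated here amounts to asking the LLM for a graph with a given property and checking the reply programmatically. A minimal sketch, in which the reply text, the edge format, and the "generate a tree" rule are all assumed for illustration rather than taken from the paper:

```python
import re

# Hypothetical LLM reply after being asked for a tree on 5 nodes (0..4),
# returned as a plain-text edge list.
llm_reply = "(0,1) (1,2) (1,3) (3,4)"

def parse_edges(text):
    # Pull "(u,v)" pairs out of free-form model output.
    return [tuple(map(int, m)) for m in re.findall(r"\((\d+)\s*,\s*(\d+)\)", text)]

def is_tree(n, edges):
    # A tree on n nodes has exactly n-1 edges and no cycles (union-find check).
    if len(edges) != n - 1:
        return False
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return False  # edge closes a cycle
        parent[ru] = rv
    return True

edges = parse_edges(llm_reply)
print(is_tree(5, edges))  # True
```

The same validate-after-generate loop extends to other structure rules (planarity, degree bounds, molecule validity) by swapping in the corresponding checker.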
Graph Neural Network for Crawling Target Nodes in Social Networks
Crawling social networks has been the focus of active research in recent years. One challenging task is to collect target nodes in an initially unknown graph given a budget of crawling steps. Predicting a node's properties from its partially known neighbourhood is at the heart of a successful crawler. In this paper we adopt graph neural networks for this purpose and show that they are competitive with traditional classifiers and better in individual cases. Additionally, we suggest a training-sample boosting technique, which helps to diversify the training set at early stages of crawling and thus improves predictor quality. An experimental study on three types of target-set topology indicates that the GNN-based approach has potential in the crawling task, especially in the case of distributed target nodes.
·arxiv.org·
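The core signal such a crawler's predictor exploits, a frontier node's partially observed neighbourhood, can be shown with a toy scorer. The nodes, labels, and averaging rule below are made up for illustration and are much simpler than the paper's GNN:

```python
# Nodes already crawled, with their observed target label (1 = target node).
observed = {
    "a": 1,
    "b": 0,
    "c": 1,
}

# Frontier: uncrawled nodes, known only through their crawled neighbours.
frontier_neighbours = {"x": ["a", "b"], "y": ["b"], "z": ["a", "c"]}

def score(node):
    # Estimate P(target) as the fraction of crawled neighbours that are targets.
    nbrs = frontier_neighbours[node]
    return sum(observed[n] for n in nbrs) / len(nbrs)

# Greedy crawler: spend the next budget step on the highest-scoring node.
best = max(frontier_neighbours, key=score)
print(best, score(best))
```

A GNN replaces this hand-written average with learned aggregation over node features and multi-hop structure, but the crawl loop (score frontier, expand best, update) stays the same.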
Graphs Unveiled: Graph Neural Networks and Graph Generation
One of the hot topics in machine learning is the field of GNNs. The complexity of graph data has imposed significant challenges on existing machine learning algorithms. Recently, many studies extending deep learning approaches to graph data have emerged. This paper presents a survey providing a comprehensive overview of Graph Neural Networks (GNNs). We discuss the applications of graph neural networks across various domains. Finally, we present an advanced field in GNNs: graph generation.
·arxiv.org·
Oreos and Ontology
Grab some Oreos and let's chow down some #ontology basics. As an English Literature major, we did talk about quite a few philosophical concepts, but the…
·linkedin.com·
OpenGraph: Open-Vocabulary Hierarchical 3D Graph Representation in Large-Scale Outdoor Environments
Environment maps endowed with sophisticated semantics are pivotal for facilitating seamless interaction between robots and humans, enabling them to effectively carry out various tasks. Open-vocabulary maps, powered by Visual-Language models (VLMs), possess inherent advantages, including multimodal retrieval and open-set classes. However, existing open-vocabulary maps are constrained to closed indoor scenarios and VLM features, thereby diminishing their usability and inference capabilities. Moreover, the absence of topological relationships further complicates the accurate querying of specific instances. In this work, we propose OpenGraph, an open-vocabulary hierarchical graph representation designed for large-scale outdoor environments. OpenGraph initially extracts instances and their captions from visual images using 2D foundation models, encoding the captions with features to enhance textual reasoning. Subsequently, 3D incremental panoramic mapping with feature embedding is achieved by projecting images onto LiDAR point clouds. Finally, the environment is segmented based on lane graph connectivity to construct a hierarchical graph. Validation results on the real public dataset SemanticKITTI demonstrate that, even without fine-tuning the models, OpenGraph exhibits the ability to generalize to novel semantic classes and achieve the highest segmentation and query accuracy. The source code of OpenGraph is publicly available at https://github.com/BIT-DYN/OpenGraph.
·arxiv.org·
TigerGraph CoPilot's public alpha release
🚀 Exciting News Alert! 🚀 We're over the moon to announce the launch of TigerGraph CoPilot's public alpha release! 🌟 🔗 Get ready to explore the future of…
·linkedin.com·
Knowledge Graph Large Language Model (KG-LLM) for Link Prediction
The task of predicting multiple links within knowledge graphs (KGs) stands as a challenge in the field of knowledge graph analysis, a challenge increasingly resolvable due to advancements in natural language processing (NLP) and KG embedding techniques. This paper introduces a novel methodology, the Knowledge Graph Large Language Model Framework (KG-LLM), which leverages pivotal NLP paradigms, including chain-of-thought (CoT) prompting and in-context learning (ICL), to enhance multi-hop link prediction in KGs. By converting the KG to a CoT prompt, our framework is designed to discern and learn the latent representations of entities and their interrelations. To show the efficacy of the KG-LLM Framework, we fine-tune three leading Large Language Models (LLMs) within this framework, employing both non-ICL and ICL tasks for a comprehensive evaluation. Further, we explore the framework's potential to provide LLMs with zero-shot capabilities for handling previously unseen prompts. Our experimental findings show that integrating ICL and CoT not only augments the performance of our approach but also significantly boosts the models' generalization capacity, thereby ensuring more precise predictions in unfamiliar scenarios.
·arxiv.org·
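The key step, converting a KG path into a chain-of-thought prompt, might look roughly like this. The template and triples below are illustrative assumptions, not the paper's exact prompt format:

```python
# Example KG path (hypothetical triples, not from the paper's datasets).
triples = [
    ("Marie Curie", "born_in", "Warsaw"),
    ("Warsaw", "capital_of", "Poland"),
]

def kg_to_cot_prompt(path, query):
    # Serialize each hop of the path as an explicit reasoning step.
    steps = [f"Step {i + 1}: {h} --{r}--> {t}." for i, (h, r, t) in enumerate(path)]
    return (
        "Reason over the following knowledge-graph path step by step.\n"
        + "\n".join(steps)
        + f"\nQuestion: {query}\nAnswer:"
    )

prompt = kg_to_cot_prompt(triples, "Is Marie Curie linked to Poland?")
print(prompt)
```

The fine-tuned LLM then completes the prompt with a yes/no (or entity) prediction, so the multi-hop structure of the KG is made explicit in the input rather than left implicit in embeddings.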
On Evaluating Taxonomies | LinkedIn
Bob Kasenchak, Factor. One of the regular tasks we undertake when starting an engagement with a new client involves cataloging and evaluating the existing taxonomies in their business information ecosystem. But not all taxonomies are created equal; or, perhaps more specifically, not all taxonomies serve
·linkedin.com·
DeepOnto: A Python Package for Ontology Engineering with Deep Learning
Integrating deep learning techniques, particularly language models (LMs), with knowledge representation techniques like ontologies has attracted widespread attention, raising the need for a platform that supports both paradigms. Although packages such as OWL API and Jena offer robust support for basic ontology processing features, they lack the capability to transform various types of information within ontologies into formats suitable for downstream deep learning-based applications. Moreover, widely-used ontology APIs are primarily Java-based while deep learning frameworks like PyTorch and Tensorflow are mainly for Python programming. To address these needs, we present DeepOnto, a Python package designed for ontology engineering with deep learning. The package encompasses a core ontology processing module founded on the widely-recognised and reliable OWL API, encapsulating its fundamental features in a more "Pythonic" manner and extending its capabilities to incorporate other essential components including reasoning, verbalisation, normalisation, taxonomy, projection, and more. Building on this module, DeepOnto offers a suite of tools, resources, and algorithms that support various ontology engineering tasks, such as ontology alignment and completion, by harnessing deep learning methods, primarily pre-trained LMs. In this paper, we also demonstrate the practical utility of DeepOnto through two use-cases: the Digital Health Coaching in Samsung Research UK and the Bio-ML track of the Ontology Alignment Evaluation Initiative (OAEI).
·arxiv.org·
Enhancing RAG-based application accuracy by constructing and leveraging knowledge graphs
A practical guide to constructing and retrieving information from knowledge graphs in RAG applications with Neo4j and LangChain Editor's Note: the following is a guest blog post from Tomaz Bratanic, who focuses on Graph ML and GenAI research at Neo4j. Neo4j is a graph database and analytics company which helps
·blog.langchain.dev·
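Stripped of the Neo4j and LangChain specifics the guide itself uses, the underlying KG-RAG loop is: extract triples from documents, index them by entity, and retrieve an entity's neighbourhood as context for the LLM. A toolkit-agnostic sketch with made-up triples:

```python
from collections import defaultdict

# Triples an extraction step (in the post, an LLM-driven one) might produce.
triples = [
    ("Neo4j", "is_a", "graph database"),
    ("Neo4j", "integrates_with", "LangChain"),
    ("LangChain", "is_a", "LLM framework"),
]

# Adjacency index: entity -> list of (relation, object) edges.
graph = defaultdict(list)
for h, r, t in triples:
    graph[h].append((r, t))

def retrieve_context(question):
    # Naive entity linking: any graph entity named in the question
    # contributes its outgoing edges as grounding facts.
    facts = []
    for entity, edges in graph.items():
        if entity.lower() in question.lower():
            facts += [f"{entity} {r} {t}" for r, t in edges]
    return facts  # would be prepended to the LLM prompt as context

print(retrieve_context("What is Neo4j?"))
```

A real deployment swaps the dict for a graph database (so retrieval can walk multiple hops with a query language like Cypher) and the substring match for proper entity linking, but the accuracy gain comes from this structure-aware retrieval step.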
An Intro to Building Knowledge Graphs
Editor’s note: Sumit Pal is a speaker for ODSC East this April 23-25. Be sure to check out his talk, “Building Knowledge Graphs,” there! Graphs and Knowledge Graphs (KGs) are all around us. We use them every day without realizing it. GPS leverages graph data structures and databases to plot...
·opendatascience.com·