GraphNews

4357 bookmarks
Knowledge Graph Large Language Model (KG-LLM) for Link Prediction
The task of predicting multiple links within knowledge graphs (KGs) stands as a challenge in the field of knowledge graph analysis, a challenge increasingly resolvable due to advancements in natural language processing (NLP) and KG embedding techniques. This paper introduces a novel methodology, the Knowledge Graph Large Language Model Framework (KG-LLM), which leverages pivotal NLP paradigms, including chain-of-thought (CoT) prompting and in-context learning (ICL), to enhance multi-hop link prediction in KGs. By converting the KG to a CoT prompt, our framework is designed to discern and learn the latent representations of entities and their interrelations. To show the efficacy of the KG-LLM Framework, we fine-tune three leading Large Language Models (LLMs) within this framework, employing both non-ICL and ICL tasks for a comprehensive evaluation. Further, we explore the framework's potential to provide LLMs with zero-shot capabilities for handling previously unseen prompts. Our experiments show that integrating ICL and CoT not only augments the performance of our approach but also significantly boosts the models' generalization capacity, thereby ensuring more precise predictions in unfamiliar scenarios.
·arxiv.org·
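The core move described in the abstract, serializing a KG path into a chain-of-thought prompt for multi-hop link prediction, can be sketched in a few lines. The template, entity names, and `kg_to_cot_prompt` helper below are illustrative assumptions, not the paper's actual prompt format.

```python
def kg_to_cot_prompt(path, question_pair):
    """Render a multi-hop KG path as a chain-of-thought style prompt.

    `path` is a list of (head, relation, tail) triples forming a chain;
    `question_pair` is the (source, target) pair whose link we query.
    The wording below is a stand-in, not the paper's exact template.
    """
    steps = [
        f"Step {i + 1}: {h} {r.replace('_', ' ')} {t}."
        for i, (h, r, t) in enumerate(path)
    ]
    src, dst = question_pair
    question = (
        f"Question: given the reasoning steps above, "
        f"is there a link between {src} and {dst}? Answer yes or no."
    )
    return "\n".join(steps + [question])

# Hypothetical two-hop chain: does a link exist between Alice and Berlin?
path = [
    ("Alice", "works_at", "AcmeCorp"),
    ("AcmeCorp", "located_in", "Berlin"),
]
prompt = kg_to_cot_prompt(path, ("Alice", "Berlin"))
```

A fine-tuned LLM would then be trained to answer such prompts, with the step-by-step rendering encouraging it to reason over the intermediate hops rather than memorize endpoint pairs.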
On Evaluating Taxonomies | LinkedIn
Bob Kasenchak, Factor: One of the regular tasks we undertake when starting an engagement with a new client involves cataloging and evaluating existing taxonomies in their business information ecosystem. But not all taxonomies are created equal; or, perhaps more specifically, not all taxonomies serve…
·linkedin.com·
DeepOnto: A Python Package for Ontology Engineering with Deep Learning
Integrating deep learning techniques, particularly language models (LMs), with knowledge representation techniques like ontologies has attracted widespread attention, creating the need for a platform that supports both paradigms. Although packages such as OWL API and Jena offer robust support for basic ontology processing features, they lack the capability to transform various types of information within ontologies into formats suitable for downstream deep learning-based applications. Moreover, widely-used ontology APIs are primarily Java-based while deep learning frameworks like PyTorch and TensorFlow are mainly for Python programming. To address these needs, we present DeepOnto, a Python package designed for ontology engineering with deep learning. The package encompasses a core ontology processing module founded on the widely-recognised and reliable OWL API, encapsulating its fundamental features in a more "Pythonic" manner and extending its capabilities to incorporate other essential components including reasoning, verbalisation, normalisation, taxonomy, projection, and more. Building on this module, DeepOnto offers a suite of tools, resources, and algorithms that support various ontology engineering tasks, such as ontology alignment and completion, by harnessing deep learning methods, primarily pre-trained LMs. In this paper, we also demonstrate the practical utility of DeepOnto through two use-cases: the Digital Health Coaching in Samsung Research UK and the Bio-ML track of the Ontology Alignment Evaluation Initiative (OAEI).
·arxiv.org·
Enhancing RAG-based application accuracy by constructing and leveraging knowledge graphs
A practical guide to constructing and retrieving information from knowledge graphs in RAG applications with Neo4j and LangChain Editor's Note: the following is a guest blog post from Tomaz Bratanic, who focuses on Graph ML and GenAI research at Neo4j. Neo4j is a graph database and analytics company which helps
·blog.langchain.dev·
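The guide's core loop, extract entities from the question, fetch their graph neighbourhood, and hand the triples to the LLM as grounding context, can be sketched without a live database. The in-memory triple list below stands in for Neo4j; a real pipeline would issue a Cypher query through LangChain's Neo4j integration instead.

```python
# Toy knowledge graph as (head, relation, tail) triples. A production
# setup would store these in Neo4j and traverse them with Cypher.
TRIPLES = [
    ("Neo4j", "is_a", "graph database"),
    ("Neo4j", "integrates_with", "LangChain"),
    ("LangChain", "is_a", "LLM framework"),
]

def neighbourhood(entity):
    """Return all triples touching the given entity (1-hop expansion)."""
    return [t for t in TRIPLES if entity in (t[0], t[2])]

def build_context(question):
    """Naive entity linking by substring match, then 1-hop retrieval.

    The serialized triples become the grounding context prepended to
    the LLM prompt in a RAG pipeline.
    """
    mentioned = {e for t in TRIPLES for e in (t[0], t[2]) if e in question}
    facts = {t for e in mentioned for t in neighbourhood(e)}
    return "\n".join(f"{h} {r} {t}" for h, r, t in sorted(facts))

ctx = build_context("How does Neo4j relate to LangChain?")
```

The substring-based entity linking is the weakest link here and is exactly what dedicated extraction models (or LLM-based extraction, as in the guide) replace in practice.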
An Intro to Building Knowledge Graphs
Editor’s note: Sumit Pal is a speaker for ODSC East this April 23-25. Be sure to check out his talk, “Building Knowledge Graphs,” there! Graphs and Knowledge Graphs (KGs) are all around us. We use them every day without realizing it. GPS leverages graph data structures and databases to plot...
·opendatascience.com·
Personalized Audiobook Recommendations at Spotify Through Graph Neural Networks
In the ever-evolving digital audio landscape, Spotify, well-known for its music and talk content, has recently introduced audiobooks to its vast user base. While promising, this move presents significant challenges for personalized recommendations. Unlike music and podcasts, audiobooks, initially available for a fee, cannot be easily skimmed before purchase, posing higher stakes for the relevance of recommendations. Furthermore, introducing a new content type into an existing platform confronts extreme data sparsity, as most users are unfamiliar with this new content type. Lastly, recommending content to millions of users requires the model to react fast and be scalable. To address these challenges, we leverage podcast and music user preferences and introduce 2T-HGNN, a scalable recommendation system comprising Heterogeneous Graph Neural Networks (HGNNs) and a Two Tower (2T) model. This novel approach uncovers nuanced item relationships while ensuring low latency and complexity. We decouple users from the HGNN graph and propose an innovative multi-link neighbor sampler. These choices, together with the 2T component, significantly reduce the complexity of the HGNN model. Empirical evaluations involving millions of users show significant improvement in the quality of personalized recommendations, resulting in a +46% increase in new audiobooks start rate and a +23% boost in streaming rates. Intriguingly, our model's impact extends beyond audiobooks, benefiting established products like podcasts.
·arxiv.org·
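The Two Tower half of 2T-HGNN can be illustrated independently of the HGNN component: one tower embeds users, the other embeds items, and scoring reduces to a dot product, which is what keeps serving fast at scale (item vectors can be precomputed and indexed). The dimensions, random weights, and single ReLU layer below are arbitrary stand-ins, not Spotify's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def tower(x, W):
    # One linear layer plus ReLU stands in for each tower's encoder.
    return np.maximum(W @ x, 0.0)

# Separate (hypothetical) weights for the user tower and the item tower,
# both projecting 16-dim input features into a shared 8-dim space.
W_user = rng.normal(size=(8, 16))
W_item = rng.normal(size=(8, 16))

user = tower(rng.normal(size=16), W_user)
items = np.stack([tower(rng.normal(size=16), W_item) for _ in range(5)])

# Recommendation is a dot product against precomputed item vectors.
scores = items @ user
best = int(np.argmax(scores))
```

In the paper's setup, the HGNN contributes the item representations (capturing audiobook-podcast-music relationships), while the two-tower model handles users, which is how they decouple users from the graph and keep latency low.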
Demystifying Embedding Spaces using Large Language Models
Embeddings are telling a story that we haven't been listening to. Embeddings are everywhere: they power search, recommendations, RAG, and much more. They are…
·linkedin.com·
Knowledge, Data and LLMs
Today is a pretty special day. In some sense, this is the day I’ve been waiting for all my life. The day that we figure out how to make…
·medium.com·
A word of caution from Netflix against blindly using cosine similarity as a measure of semantic similarity
A word of caution from Netflix against blindly using cosine similarity as a measure of semantic similarity: https://lnkd.in/gX3tR4YK They study linear matrix…
·linkedin.com·
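The paper's point, that cosine similarity of learned embeddings can be rendered arbitrary by rescalings the training objective does not pin down, is easy to reproduce on toy vectors: a diagonal rescaling of the latent dimensions flips which neighbour looks "closest". The vectors below are contrived for illustration, not taken from the paper.

```python
import numpy as np

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Contrived 2-D "embeddings": by cosine, b is the nearest neighbour of a.
a = np.array([1.0, 0.2])
b = np.array([1.0, 0.0])
c = np.array([0.0, 1.0])
assert cosine(a, b) > cosine(a, c)

# Rescale the latent dimensions. For regularized matrix factorization,
# such rescalings can leave the training objective unchanged, yet the
# cosine-based ranking flips: c now looks closer to a than b does.
D = np.diag([1.0, 10.0])
a2, b2, c2 = D @ a, D @ b, D @ c
assert cosine(a2, b2) < cosine(a2, c2)
```

Dot products against unnormalized vectors, or similarities evaluated in the model's actual task space, sidestep this particular failure mode.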
Graph neural networks
Nature Reviews Methods Primers - Graph neural networks are a class of deep learning methods that can model physical systems, generate new molecules and identify drug candidates. This Primer...
·nature.com·
A Survey of Graph Neural Networks in Real world: Imbalance, Noise, Privacy and OOD Challenges
Graph-structured data exhibits universality and widespread applicability across diverse domains, such as social network analysis, biochemistry, financial fraud detection, and network security. Significant strides have been made in leveraging Graph Neural Networks (GNNs) to achieve remarkable success in these areas. However, in real-world scenarios, the training environment for models is often far from ideal, leading to substantial performance degradation of GNN models due to various unfavorable factors, including imbalance in data distribution, noise in the data, privacy protection of sensitive information, and generalization capability for out-of-distribution (OOD) scenarios. To tackle these issues, substantial efforts have been devoted to improving the performance of GNN models in practical real-world scenarios, as well as enhancing their reliability and robustness. In this paper, we present a comprehensive survey that systematically reviews existing GNN models, focusing on solutions to four real-world challenges that many existing reviews have not considered: imbalance, noise, privacy, and OOD. Specifically, we first highlight the four key challenges faced by existing GNNs, paving the way for our exploration of real-world GNN models. Subsequently, we provide detailed discussions on these four aspects, dissecting how these solutions contribute to enhancing the reliability and robustness of GNN models. Last but not least, we outline promising directions and offer future perspectives in the field.
·arxiv.org·
Tony Seale Knowledge Graph Chatbot
I am thrilled to introduce a new AI Study Guide (https://lnkd.in/g4rPZVHW) dedicated to Tony Seale, another of my favorite authors, thought leaders, and…
Knowledge Graph
·linkedin.com·
PyGraft: Configurable Generation of Synthetic Schemas and Knowledge Graphs at Your Fingertips
Knowledge graphs (KGs) have emerged as a prominent data representation and management paradigm. Being usually underpinned by a schema (e.g., an ontology), KGs capture not only factual information but also contextual knowledge. In some tasks, a few KGs established themselves as standard benchmarks. However, recent works outline that relying on a limited collection of datasets is not sufficient to assess the generalization capability of an approach. In some data-sensitive fields such as education or medicine, access to public datasets is even more limited. To remedy the aforementioned issues, we release PyGraft, a Python-based tool that generates highly customized, domain-agnostic schemas and KGs. The synthesized schemas encompass various RDFS and OWL constructs, while the synthesized KGs emulate the characteristics and scale of real-world KGs. Logical consistency of the generated resources is ultimately ensured by running a description logic (DL) reasoner. By providing a way of generating both a schema and KG in a single pipeline, PyGraft's aim is to empower the generation of a more diverse array of KGs for benchmarking novel approaches in areas such as graph-based machine learning (ML), or more generally KG processing. In graph-based ML in particular, this should foster a more holistic evaluation of model performance and generalization capability, thereby going beyond the limited collection of available benchmarks. PyGraft is available at: https://github.com/nicolas-hbt/pygraft.
·arxiv.org·
KGLM-Loop: A Bi-Directional Data Flywheel for Knowledge Graph Refinement and Hallucination Detection in Large Language Models
In the pursuit of…
·linkedin.com·
Decoding the Semantic Layer
We've been hearing the term "Semantic layer" without truly understanding the semantics of it. So, here is episode 11 of #DnABytes and today's topic is:…
·linkedin.com·