Found 453 bookmarks
GraphGPT
🌟GraphGPT🌟 (385 stars on GitHub) has been accepted to 🌟SIGIR'24🌟 (only a 20.1% acceptance rate)! Thanks to Yuhao Yang, Wei Wei, and other co-authors for their precious…
·linkedin.com·
Tree-based RAG with RAPTOR and how knowledge graphs can come to the rescue to enhance answer quality.
Long-context models, such as Google Gemini Pro 1.5 or the Large World Model, are probably changing the way we think about RAG (retrieval-augmented generation)…
·linkedin.com·
Jensen Huang in his keynote at NVIDIA GTC24 calls out three sources of data to integrate with LLMs: 1) vector databases, 2) ERP / CRM and 3) knowledge graphs
Wow: in his keynote at NVIDIA #GTC24, Jensen Huang (CEO) calls out three sources of data to integrate with LLMs: 1) vector databases, 2) ERP / CRM, and 3) *knowledge graphs*.
·linkedin.com·
Kurt Cagle chatbot on Knowledge Graphs, Ontology, GenAI and Data
I want to thank Jay (JieBing) Yu, PhD for his hard work in creating a Mini-Me (https://lnkd.in/g6TR543j), a virtual assistant built on his fantastic LLM work…
Kurt is one of my favorite writers, a seasoned practitioner and deep thinker in the areas of Knowledge Graphs, Ontology, GenAI and Data
·linkedin.com·
Exploring the Potential of Large Language Models in Graph Generation
Large language models (LLMs) have achieved great success in many fields, and recent works have explored LLMs for graph discriminative tasks such as node classification. However, the abilities of LLMs for graph generation remain unexplored in the literature. Graph generation requires the LLM to generate graphs with given properties, which has valuable real-world applications such as drug discovery, but tends to be more challenging. In this paper, we propose LLM4GraphGen to explore the ability of LLMs for graph generation with systematic task designs and extensive experiments. Specifically, we propose several tasks with comprehensive experiments to address key questions regarding LLMs' understanding of different graph-structure rules, their ability to capture structural type distributions, and their use of domain knowledge for property-based graph generation. Our evaluations demonstrate that LLMs, particularly GPT-4, exhibit preliminary abilities in graph generation tasks, including rule-based and distribution-based generation. We also observe that popular prompting methods, such as few-shot and chain-of-thought prompting, do not consistently enhance performance. Besides, LLMs show potential in generating molecules with specific properties. These findings may serve as a foundation for designing capable LLM-based models for graph generation and provide valuable insights for further research.
·arxiv.org·
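The rule-based evaluation the abstract describes can be pictured with a small, self-contained sketch (the edge-list format and helper names here are hypothetical, not the paper's): parse an LLM-generated edge list and check whether it satisfies a requested structural rule, e.g. that the graph is a tree.

```python
from collections import defaultdict


def parse_edge_list(text: str) -> list[tuple[str, str]]:
    """Parse lines like 'A - B' (a made-up output format) into edges."""
    edges = []
    for line in text.strip().splitlines():
        u, _, v = line.partition("-")
        edges.append((u.strip(), v.strip()))
    return edges


def is_tree(edges: list[tuple[str, str]]) -> bool:
    """A graph is a tree iff it is connected and |E| = |V| - 1."""
    nodes = {n for e in edges for n in e}
    if len(edges) != len(nodes) - 1:
        return False
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    # Depth-first search from an arbitrary node to test connectivity.
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(adj[n] - seen)
    return seen == nodes


# A made-up LLM reply to a "generate a tree on 4 nodes" task:
reply = "A - B\nB - C\nB - D"
print(is_tree(parse_edge_list(reply)))  # True
```

A distribution-based check would instead compare statistics (degree histograms, motif counts) of many generated graphs against a target distribution.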
OpenGraph: Open-Vocabulary Hierarchical 3D Graph Representation in Large-Scale Outdoor Environments
Environment maps endowed with sophisticated semantics are pivotal for facilitating seamless interaction between robots and humans, enabling them to effectively carry out various tasks. Open-vocabulary maps, powered by visual-language models (VLMs), possess inherent advantages, including multimodal retrieval and open-set classes. However, existing open-vocabulary maps are constrained to closed indoor scenarios and VLM features, thereby diminishing their usability and inference capabilities. Moreover, the absence of topological relationships further complicates the accurate querying of specific instances. In this work, we propose OpenGraph, an open-vocabulary hierarchical graph representation designed for large-scale outdoor environments. OpenGraph first extracts instances and their captions from visual images using 2D foundation models, encoding the captions with features to enhance textual reasoning. Subsequently, 3D incremental panoramic mapping with feature embedding is achieved by projecting images onto LiDAR point clouds. Finally, the environment is segmented based on lane-graph connectivity to construct a hierarchical graph. Validation on the real, public SemanticKITTI dataset demonstrates that, even without fine-tuning the models, OpenGraph generalizes to novel semantic classes and achieves the highest segmentation and query accuracy. The source code of OpenGraph is publicly available at https://github.com/BIT-DYN/OpenGraph.
·arxiv.org·
TigerGraph CoPilot's public alpha release
🚀 Exciting News Alert! 🚀 We're over the moon to announce the launch of TigerGraph CoPilot's public alpha release! 🌟 🔗 Get ready to explore the future of…
·linkedin.com·
Knowledge Graph Large Language Model (KG-LLM) for Link Prediction
The task of predicting multiple links within knowledge graphs (KGs) stands as a challenge in the field of knowledge graph analysis, a challenge increasingly resolvable due to advancements in natural language processing (NLP) and KG embedding techniques. This paper introduces a novel methodology, the Knowledge Graph Large Language Model Framework (KG-LLM), which leverages pivotal NLP paradigms, including chain-of-thought (CoT) prompting and in-context learning (ICL), to enhance multi-hop link prediction in KGs. By converting the KG to a CoT prompt, our framework is designed to discern and learn the latent representations of entities and their interrelations. To show the efficacy of the KG-LLM Framework, we fine-tune three leading Large Language Models (LLMs) within this framework, employing both non-ICL and ICL tasks for a comprehensive evaluation. Further, we explore the framework's potential to provide LLMs with zero-shot capabilities for handling previously unseen prompts. Our experimental findings show that integrating ICL and CoT not only augments the performance of our approach but also significantly boosts the models' generalization capacity, thereby ensuring more precise predictions in unfamiliar scenarios.
·arxiv.org·
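The core move the abstract describes — linearizing a KG into a chain-of-thought prompt for multi-hop link prediction — can be sketched roughly as follows (the prompt wording and triple format are illustrative assumptions, not the paper's):

```python
def triples_to_cot_prompt(triples: list[tuple[str, str, str]],
                          head: str, tail: str) -> str:
    """Render KG triples as text and ask the model to reason over them.

    `triples` are (head, relation, tail) facts; the wording below is a
    hypothetical template, not KG-LLM's actual prompt.
    """
    facts = "\n".join(f"{h} --{r}--> {t}" for h, r, t in triples)
    return (
        "Given the following knowledge-graph facts:\n"
        f"{facts}\n"
        "Reason step by step along the relations to decide whether "
        f"'{head}' is linked to '{tail}'. Answer yes or no."
    )


prompt = triples_to_cot_prompt(
    [("Alice", "works_at", "Acme"), ("Acme", "located_in", "Berlin")],
    "Alice", "Berlin",
)
print(prompt)
```

In the paper's setting, prompts of this shape would be used both for fine-tuning and for in-context-learning evaluation of the underlying LLMs.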
DeepOnto: A Python Package for Ontology Engineering with Deep Learning
Integrating deep learning techniques, particularly language models (LMs), with knowledge representation techniques like ontologies has raised widespread attention, creating the need for a platform that supports both paradigms. Although packages such as the OWL API and Jena offer robust support for basic ontology processing features, they lack the capability to transform various types of information within ontologies into formats suitable for downstream deep learning-based applications. Moreover, widely used ontology APIs are primarily Java-based, while deep learning frameworks like PyTorch and TensorFlow are mainly for Python programming. To address these needs, we present DeepOnto, a Python package designed for ontology engineering with deep learning. The package encompasses a core ontology processing module founded on the widely recognised and reliable OWL API, encapsulating its fundamental features in a more "Pythonic" manner and extending its capabilities to incorporate other essential components including reasoning, verbalisation, normalisation, taxonomy, projection, and more. Building on this module, DeepOnto offers a suite of tools, resources, and algorithms that support various ontology engineering tasks, such as ontology alignment and completion, by harnessing deep learning methods, primarily pre-trained LMs. In this paper, we also demonstrate the practical utility of DeepOnto through two use cases: Digital Health Coaching at Samsung Research UK and the Bio-ML track of the Ontology Alignment Evaluation Initiative (OAEI).
·arxiv.org·
Enhancing RAG-based application accuracy by constructing and leveraging knowledge graphs
A practical guide to constructing and retrieving information from knowledge graphs in RAG applications with Neo4j and LangChain. Editor's Note: the following is a guest blog post from Tomaz Bratanic, who focuses on Graph ML and GenAI research at Neo4j. Neo4j is a graph database and analytics company which helps…
·blog.langchain.dev·
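As a rough illustration of the pattern the post describes — enriching retrieved context with graph structure — here is a toy sketch in which an in-memory dict stands in for the Neo4j database (entity names and relations are made up; the real guide uses Neo4j and LangChain APIs):

```python
# Toy graph-augmented retrieval: after vector search returns a seed
# entity, pull its 1-hop neighborhood from the knowledge graph to
# enrich the context handed to the LLM.
KG = {
    "Neo4j": [("IS_A", "graph database"), ("INTEGRATES_WITH", "LangChain")],
    "LangChain": [("IS_A", "LLM framework")],
}


def graph_context(seed: str, kg: dict) -> str:
    """Render the seed entity's outgoing relations as context sentences."""
    facts = kg.get(seed, [])
    return "\n".join(f"{seed} {rel.replace('_', ' ').lower()} {obj}"
                     for rel, obj in facts)


print(graph_context("Neo4j", KG))
# Neo4j is a graph database
# Neo4j integrates with LangChain
```

In a production setup the dict lookup would be a Cypher query against Neo4j, and the rendered facts would be appended to the retrieved document chunks before prompting.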
Personalized Audiobook Recommendations at Spotify Through Graph Neural Networks
In the ever-evolving digital audio landscape, Spotify, well known for its music and talk content, has recently introduced audiobooks to its vast user base. While promising, this move presents significant challenges for personalized recommendations. Unlike music and podcasts, audiobooks, initially available for a fee, cannot be easily skimmed before purchase, posing higher stakes for the relevance of recommendations. Furthermore, introducing a new content type into an existing platform confronts extreme data sparsity, as most users are unfamiliar with this new content type. Lastly, recommending content to millions of users requires the model to respond quickly and scale. To address these challenges, we leverage podcast and music user preferences and introduce 2T-HGNN, a scalable recommendation system comprising Heterogeneous Graph Neural Networks (HGNNs) and a Two Tower (2T) model. This novel approach uncovers nuanced item relationships while ensuring low latency and complexity. We decouple users from the HGNN graph and propose an innovative multi-link neighbor sampler. These choices, together with the 2T component, significantly reduce the complexity of the HGNN model. Empirical evaluations involving millions of users show significant improvement in the quality of personalized recommendations, resulting in a +46% increase in the new-audiobook start rate and a +23% boost in streaming rates. Intriguingly, our model's impact extends beyond audiobooks, benefiting established products like podcasts.
·arxiv.org·
Knowledge, Data and LLMs
Today is a pretty special day. In some sense, this is the day I’ve been waiting for all my life. The day that we figure out how to make…
·medium.com·
A word of caution from Netflix against blindly using cosine similarity as a measure of semantic similarity
A word of caution from Netflix against blindly using cosine similarity as a measure of semantic similarity: https://lnkd.in/gX3tR4YK They study linear matrix…
·linkedin.com·
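The paper's core observation can be reproduced in a few lines: in a matrix factorization W·H, rescaling a latent dimension by s in W and by 1/s in H leaves every prediction unchanged, yet the cosine similarity between embedding rows changes arbitrarily. A minimal sketch with made-up numbers:

```python
import math


def cosine(u, v):
    """Cosine similarity of two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))


def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]


# User and item embeddings from a rank-2 factorization:
W = [[1.0, 2.0], [3.0, 1.0]]  # user embeddings (rows)
H = [[1.0, 0.5], [2.0, 1.0]]  # item embeddings (columns)
# Rescale latent dim 0 by s in W and by 1/s in H: an equally valid solution.
s = 10.0
W2 = [[w0 * s, w1] for w0, w1 in W]
H2 = [[h / s for h in H[0]]] + H[1:]

print(matmul(W, H))    # [[5.0, 2.5], [5.0, 2.5]]
print(matmul(W2, H2))  # same predictions (up to float rounding)
print(round(cosine(W[0], W[1]), 3))    # 0.707
print(round(cosine(W2[0], W2[1]), 3))  # 0.987
```

Both factorizations explain the data equally well, so nothing in training pins down which user-user cosine similarity (0.707 vs 0.987) is "the" semantic one — exactly the arbitrariness the paper warns about.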
Tony Seale Knowledge Graph Chatbot
I am thrilled to introduce a new AI Study Guide (https://lnkd.in/g4rPZVHW) dedicated to Tony Seale, another of my favorite authors, thought leaders, and…
·linkedin.com·