GraphNews

4343 bookmarks
Do Similar Entities have Similar Embeddings?
Knowledge graph embedding models (KGEMs) developed for link prediction learn vector representations for graph entities, known as embeddings. A common tacit assumption is the KGE entity similarity assumption, which states that these KGEMs retain the graph's structure within their embedding space, i.e., position similar entities close to one another. This desirable property makes KGEMs widely used in downstream tasks such as recommender systems or drug repurposing. Yet, the alignment of graph similarity with embedding space similarity has rarely been formally evaluated. Typically, KGEMs are assessed solely on their link prediction capabilities, using rank-based metrics such as Hits@K or Mean Rank. This paper challenges the prevailing assumption that entity similarity in the graph is inherently mirrored in the embedding space. To this end, we conduct extensive experiments to measure the capability of KGEMs to cluster similar entities together, and investigate the nature of the underlying factors. Moreover, we study whether different KGEMs expose different notions of similarity. Datasets, pre-trained embeddings and code are available at: https://github.com/nicolas-hbt/similar-embeddings.
·arxiv.org·
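As a rough illustration of what the entity similarity assumption means in practice (not code from the paper; the labels and random vectors below are placeholders for trained KGE embeddings), one can check whether an entity's nearest neighbour under cosine similarity looks semantically related:

```python
# Minimal sketch: nearest neighbours in a toy embedding space via cosine similarity.
import numpy as np

rng = np.random.default_rng(0)
entities = ["aspirin", "ibuprofen", "paris", "berlin"]   # illustrative labels
E = rng.normal(size=(len(entities), 16))                 # stand-in for trained KGE vectors

# Cosine similarity matrix
E_norm = E / np.linalg.norm(E, axis=1, keepdims=True)
sim = E_norm @ E_norm.T

# Nearest neighbour (excluding self) for each entity
np.fill_diagonal(sim, -np.inf)
for i, name in enumerate(entities):
    print(name, "->", entities[int(np.argmax(sim[i]))])
```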
Data gauging, covariance and equivariance | Maurice Weiler
The numerical representation of data is often ambiguous. This leads to a gauge-theoretic view on data, requiring covariant or equivariant neural networks, which are reviewed in this blog post.
·maurice-weiler.gitlab.io·
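A quick way to see what equivariance asks of a network (an illustrative sketch, not from the post): verify numerically that a toy sum-aggregation graph layer satisfies f(g·x) = g·f(x) when g is a node permutation.

```python
# Numerically check permutation equivariance of a simple message-passing layer.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(5, 8))                    # node features
A = (rng.random((5, 5)) < 0.4).astype(float)   # arbitrary toy adjacency
W = rng.normal(size=(8, 8))

def layer(A, X):
    return np.tanh(A @ X @ W)                  # sum-aggregation message passing

P = np.eye(5)[rng.permutation(5)]              # permutation matrix g

lhs = layer(P @ A @ P.T, P @ X)                # f(g·x): permute input, then apply layer
rhs = P @ layer(A, X)                          # g·f(x): apply layer, then permute output
print(np.allclose(lhs, rhs))                   # True: the layer is permutation-equivariant
```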
Neural algorithmic reasoning
In this article, we will talk about classical computation: the kind of computation typically found in an undergraduate Computer Science course on Algorithms and Data Structures [1]. Think shortest path-finding, sorting, clever ways to break problems down into simpler problems, incredible ways to organise data for efficient retrieval and updates.
·thegradient.pub·
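For concreteness, one of the textbook algorithms the article has in mind is single-source shortest paths; a plain Bellman-Ford implementation (the standard algorithm, not code taken from the article) looks like this:

```python
# Bellman-Ford single-source shortest paths over a weighted edge list.
import math

def bellman_ford(n, edges, source):
    """n nodes, edges = [(u, v, w), ...], returns distances from source."""
    dist = [math.inf] * n
    dist[source] = 0.0
    for _ in range(n - 1):                 # relax every edge n-1 times
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    return dist

print(bellman_ford(4, [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 5.0), (2, 3, 1.0)], 0))
# [0.0, 1.0, 3.0, 4.0]
```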
Transforming Unstructured Text into RDF Triples with AI. | LinkedIn
Over the past few months, I've been immersed in an exciting experiment, leveraging OpenAI's advanced language models to transform unstructured text into RDF (Resource Description Framework) triples. The journey, as thrilling as it has been, is filled with ongoing challenges and learning experiences.
·linkedin.com·
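A rough sketch of the kind of pipeline the post describes (the prompt wording, model name, and example namespace below are assumptions, not the author's code): ask the model for triples, then load them into an rdflib graph.

```python
# Sketch: extract (subject, predicate, object) triples from text with an LLM,
# then build an RDF graph with rdflib. Assumes OPENAI_API_KEY is set.
from openai import OpenAI
from rdflib import Graph, URIRef, Namespace

client = OpenAI()

text = "Marie Curie was born in Warsaw and won the Nobel Prize in Physics."
prompt = (
    "Extract (subject, predicate, object) triples from the text below. "
    "Return one triple per line as: subject | predicate | object.\n\n" + text
)
resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
)

EX = Namespace("http://example.org/")   # hypothetical namespace for the demo
g = Graph()
for line in resp.choices[0].message.content.splitlines():
    parts = [p.strip().replace(" ", "_") for p in line.split("|")]
    if len(parts) == 3:
        s, p, o = parts
        g.add((URIRef(EX + s), URIRef(EX + p), URIRef(EX + o)))

print(g.serialize(format="turtle"))
```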
How the LDMs in knowledge graphs can complement LLMs - DataScienceCentral.com
Large language models (LLMs) fit parameters (features in data topography) to a particular dataset, such as text scraped off the web and conformed to a training set. Logical data models (LDMs), by contrast, model what becomes shared within entire systems. They bring together the data in a system with the help of various kinds of…
·datasciencecentral.com·
Knowledge Graphs: Breaking the Ice
This post talks about the nature and key characteristics of knowledge graphs. It also outlines the benefits of formal semantics and how…
·ontotext.medium.com·
Graph Learning Meets Artificial Intelligence
By request, here are the slides from our #neurips2023 presentation yesterday! We really enjoyed the opportunity to present the different aspects of the work…
·linkedin.com·
Language, Graphs, and AI in Industry
Here's the video for my talk @ K1st World Symposium 2023 about the intersections of KGs and LLMs: https://lnkd.in/gugB8Yjj and also the slides, plus related…
·linkedin.com·
Knowledge Graphs - Foundations and Applications
Although knowledge graphs affect our lives on a daily basis, most of us are unfamiliar with the concept. When we ask Alexa about tomorrow's weather or use Google to look up the latest news on climate change, knowledge graphs serve as the foundation of today's cutting-edge information systems. In addition, knowledge graphs have the potential to elucidate, assess, and substantiate information produced by Deep Learning models, such as ChatGPT and other large language models. Knowledge graphs have a wide range of applications, including improving search results, answering questions, providing recommendations, and developing explainable AI systems. In essence, the purpose of this course is to provide a comprehensive overview of knowledge graphs, their underlying technologies, and their significance in today's digital world.
·open.hpi.de·
knowledge graph based RAG (retrieval-augmentation) consistently improves language model accuracy, this time in biomedical questions
The evidence for the massive impact of KGs in NLQ keeps piling up - Here's one more paper that shows that knowledge graph based RAG (retrieval-augmentation)…
·linkedin.com·
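The general pattern behind KG-based RAG can be sketched as follows (illustrative only, not the paper's pipeline; the Wikidata query and prompt format are assumptions): retrieve facts from a knowledge graph, then prepend them to the question before calling the language model.

```python
# Sketch: pull facts from a knowledge graph via SPARQL, then build a grounded prompt.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://query.wikidata.org/sparql", agent="kg-rag-sketch/0.1")
sparql.setQuery("""
SELECT ?drugLabel ?diseaseLabel WHERE {
  ?drug wdt:P2175 ?disease .                      # "medical condition treated"
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
} LIMIT 5
""")
sparql.setReturnFormat(JSON)
rows = sparql.query().convert()["results"]["bindings"]

facts = "\n".join(
    f'{r["drugLabel"]["value"]} treats {r["diseaseLabel"]["value"]}' for r in rows
)
question = "Which conditions can these drugs treat?"
prompt = f"Context facts:\n{facts}\n\nQuestion: {question}\nAnswer using only the facts."
print(prompt)   # this prompt would then be sent to the language model
```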
Co-operative Graph Neural Networks
A new message-passing paradigm where every node can choose to either ‘listen’, ‘broadcast’, ‘listen & broadcast’ or ‘isolate’.
·towardsdatascience.com·
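A minimal sketch of the gating idea (illustrative; in the paper the per-node choices are learned, here they are fixed by hand): an edge carries a message only if the sender chose to broadcast and the receiver chose to listen.

```python
# Gated message passing: mask the adjacency with per-node listen/broadcast choices.
import numpy as np

rng = np.random.default_rng(2)
n, d = 5, 4
A = (rng.random((n, n)) < 0.5).astype(float)   # toy adjacency, A[v, u] = 1 means v sees u
X = rng.normal(size=(n, d))                    # node features

# Per-node actions: "listen & broadcast" = (1, 1), "isolate" = (0, 0)
listen    = np.array([1, 1, 0, 1, 0], dtype=float)
broadcast = np.array([1, 0, 1, 1, 1], dtype=float)

# Edge u -> v carries a message iff A[v, u] = 1, u broadcasts, and v listens
A_eff = listen[:, None] * A * broadcast[None, :]

H = np.tanh(A_eff @ X)                         # one gated sum-aggregation step
print(H.shape)                                 # (5, 4)
```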