GraphNews

4094 bookmarks
Architecting Solid Foundations for Scalable Knowledge Graphs | LinkedIn
Whether we remember them or not, we rely directly on unexamined and often very murky foundational assumptions that permeate everything we do. These assumptions are formulated using keystone concepts – core concepts that are so crucial that mere dictionary-style definitions are not enough.
·linkedin.com·
Knowledge Graphs Achieve Superior Reasoning versus Vector Search alone for Retrieval Augmentation
Knowledge Graphs Achieve Superior Reasoning versus Vector Search alone for Retrieval Augmentation 🔗 As artificial intelligence permeates business…
·linkedin.com·
Semantic Random Walk for Graph Representation Learning in Attributed Graphs
For so many papers about graph ML, I find that they start with a graph theoretic-ish G = (V, E) perspective, and ignore the fact that in production people are working with a semantic layer atop the labels (IRI), or with properties (aka, "attributes"). This is, quite frankly, the first time that I've encountered a paper which begins with formalisms that cover both labeled property graphs (LPGs) and semantic inference.
·arxiv.org·
Graph analytics for a new kind of economic analysis: Measures of the Capital Network of the U.S. Economy
Graph analytics for a new kind of economic analysis: "Measures of the Capital Network of the U.S. Economy"
·linkedin.com·
Learning to Count Isomorphisms with Graph Neural Networks
Subgraph isomorphism counting is an important problem on graphs, as many graph-based tasks exploit recurring subgraph patterns. Classical methods usually boil down to a backtracking framework that needs to navigate a huge search space with prohibitive computational costs. Some recent studies resort to graph neural networks (GNNs) to learn a low-dimensional representation for both the query and input graphs, in order to predict the number of subgraph isomorphisms on the input graph. However, typical GNNs employ a node-centric message passing scheme that receives and aggregates messages on nodes, which is inadequate in complex structure matching for isomorphism counting. Moreover, on an input graph, the space of possible query graphs is enormous, and different parts of the input graph will be triggered to match different queries. Thus, expecting a fixed representation of the input graph to match diversely structured query graphs is unrealistic. In this paper, we propose a novel GNN called Count-GNN for subgraph isomorphism counting, to deal with the above challenges. At the edge level, given that an edge is an atomic unit of encoding graph structures, we propose an edge-centric message passing scheme, where messages on edges are propagated and aggregated based on the edge adjacency to preserve fine-grained structural information. At the graph level, we modulate the input graph representation conditioned on the query, so that the input graph can be adapted to each query individually to improve their matching. Finally, we conduct extensive experiments on a number of benchmark datasets to demonstrate the superior performance of Count-GNN.
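The edge-centric message passing described above can be sketched roughly as follows. This is a toy NumPy illustration of the general idea — messages live on edges and are aggregated over adjacent edges rather than nodes — not the paper's Count-GNN implementation; the function name, the mean-aggregation rule, and the sum readout are assumptions for the sketch.

```python
import numpy as np

def edge_centric_message_passing(edges, edge_feats, num_layers=2):
    """Toy edge-centric message passing: each directed edge (u, v)
    aggregates features from the adjacent edges (w, u) that feed
    into its source node, preserving edge-level structure."""
    E = len(edges)
    # Edge adjacency: edge j is a predecessor of edge i if head(j) == tail(i)
    adj = [[] for _ in range(E)]
    for i, (u, _) in enumerate(edges):
        for j, (_, x) in enumerate(edges):
            if x == u and j != i:
                adj[i].append(j)
    h = np.asarray(edge_feats, dtype=float)
    for _ in range(num_layers):
        new_h = h.copy()
        for i in range(E):
            if adj[i]:
                # Combine an edge's own state with the mean of its predecessors
                new_h[i] = h[i] + np.mean([h[j] for j in adj[i]], axis=0)
        h = new_h
    # Graph-level readout: sum over all edge representations
    return h.sum(axis=0)

# Usage: a two-edge path 0 -> 1 -> 2 with scalar edge features.
out = edge_centric_message_passing([(0, 1), (1, 2)], [[1.0], [1.0]])
```

In a real GNN the aggregation would be a learned, parameterised update, and the paper additionally modulates the input-graph representation on the query; the sketch only shows the edge-adjacency propagation pattern.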
·arxiv.org·
Relational Deep Learning: Graph Representation Learning on Relational Databases
Much of the world's most valued data is stored in relational databases and data warehouses, where the data is organized into many tables connected by primary-foreign key relations. However, building machine learning models using this data is both challenging and time consuming. The core problem is that no machine learning method is capable of learning on multiple tables interconnected by primary-foreign key relations. Current methods can only learn from a single table, so the data must first be manually joined and aggregated into a single training table, the process known as feature engineering. Feature engineering is slow, error prone and leads to suboptimal models. Here we introduce an end-to-end deep representation learning approach to directly learn on data laid out across multiple tables. We name our approach Relational Deep Learning (RDL). The core idea is to view relational databases as a temporal, heterogeneous graph, with a node for each row in each table, and edges specified by primary-foreign key links. Message Passing Graph Neural Networks can then automatically learn across the graph to extract representations that leverage all input data, without any manual feature engineering. Relational Deep Learning leads to more accurate models that can be built much faster. To facilitate research in this area, we develop RelBench, a set of benchmark datasets and an implementation of Relational Deep Learning. The data covers a wide spectrum, from discussions on Stack Exchange to book reviews on the Amazon Product Catalog. Overall, we define a new research area that generalizes graph machine learning and broadens its applicability to a wide set of AI use cases.
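The core construction — one node per row, one edge per primary-foreign key link — can be sketched in a few lines. This is a minimal plain-Python illustration of the idea, not the RelBench implementation; the table names, the `pk` field convention, and the function signature are assumptions.

```python
def tables_to_graph(tables, fk_links):
    """Turn tables linked by primary-foreign keys into a graph.
    tables:   {table_name: list of row dicts, each with a "pk" field}
    fk_links: [(child_table, fk_column, parent_table), ...]
    Returns a node list (one node per row) and a typed edge list."""
    nodes = []   # (table_name, primary_key) pairs
    index = {}   # (table_name, primary_key) -> node id
    for tname, rows in tables.items():
        for row in rows:
            index[(tname, row["pk"])] = len(nodes)
            nodes.append((tname, row["pk"]))
    edges = []   # (child_node_id, parent_node_id, fk_column)
    for child, fk_col, parent in fk_links:
        for row in tables[child]:
            src = index[(child, row["pk"])]
            dst = index[(parent, row[fk_col])]
            edges.append((src, dst, fk_col))
    return nodes, edges

# Usage: two toy tables, "users" and "orders", joined by a user_id FK.
tables = {
    "users": [{"pk": 1}, {"pk": 2}],
    "orders": [{"pk": 10, "user_id": 1}, {"pk": 11, "user_id": 2}],
}
nodes, edges = tables_to_graph(tables, [("orders", "user_id", "users")])
```

A message-passing GNN run over this heterogeneous graph can then learn across tables directly, which is what replaces the manual join-and-aggregate feature engineering the abstract criticises.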
·arxiv.org·
TAG-DS
Welcome to Topology, Algebra, and Geometry in Data Science (TAG-DS). This site is intended to bring together researchers who are applying mathematical techniques to the rapidly growing field of data science. The three identified fields encompass more than 100 years of finely tuned machinery that
·tagds.com·
Graph & Geometric ML in 2024: Where We Are and What’s Next (Part I — Theory & Architectures)
Trends and recent advancements in Graph and Geometric Deep Learning
Following the tradition from previous years, we interviewed a cohort of distinguished and prolific academic and industrial experts in an attempt to summarise the highlights of the past year and predict what is in store for 2024. The past year, 2023, was so ripe with results that we had to break this post into two parts. This is Part I, focusing on theory & new architectures.
·towardsdatascience.com·