GraphNews

3951 bookmarks
Beyond Transduction: A Survey on Inductive, Few Shot, and Zero Shot Link Prediction in Knowledge Graphs
Knowledge graphs (KGs) comprise entities interconnected by relations of different semantic meanings. KGs are used in a wide range of applications. However, they inherently suffer from incompleteness, i.e., entities or facts about entities are missing. Consequently, a large body of work focuses on the completion of missing information in KGs, which is commonly referred to as link prediction (LP). This task has traditionally and extensively been studied in the transductive setting, where all entities and relations in the test set are observed during training. Recently, several works have tackled the LP task under more challenging settings, where entities and relations in the test set may be unobserved during training or appear in only a few facts. These settings are known as inductive, few-shot, and zero-shot link prediction. In this work, we conduct a systematic review of existing works in this area. A thorough analysis leads us to point out diverging terminologies and task definitions for the aforementioned settings, which further limit comparison between recent works. We consequently dissect each setting thoroughly to reveal its intrinsic characteristics. A unifying nomenclature is ultimately proposed to refer to each of them in a simple and consistent manner.
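As a concrete anchor for the transductive setting described above, here is a minimal sketch, not taken from the survey, of TransE-style link prediction in which every test entity and relation already has an embedding learned during training; the entities, the relation, and the embeddings below are toy placeholders.

```python
# Minimal illustration of transductive link prediction with a TransE-style score:
# every test entity/relation already has an embedding learned during training.
# Toy embeddings and triples are invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)
entities = ["paris", "france", "berlin", "germany"]
relations = ["capital_of"]
dim = 16

# In a real model these come from training; here they are random placeholders.
ent_emb = {e: rng.normal(size=dim) for e in entities}
rel_emb = {r: rng.normal(size=dim) for r in relations}

def score(head, rel, tail):
    """TransE plausibility: higher (less negative) means more plausible."""
    return -np.linalg.norm(ent_emb[head] + rel_emb[rel] - ent_emb[tail])

# Rank all candidate tails for the query (paris, capital_of, ?).
candidates = sorted(entities, key=lambda t: score("paris", "capital_of", t), reverse=True)
print(candidates)  # with trained embeddings, "france" should rank first
```

The inductive, few-shot, and zero-shot variants surveyed here differ precisely in that such embeddings may not exist, or are supported by only a few facts, for the entities or relations seen at test time.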
·arxiv.org·
Large Language Models on Graphs: A Comprehensive Survey
Large language models (LLMs), such as ChatGPT and LLaMA, are driving significant advances in natural language processing due to their strong text encoding/decoding ability and newly emergent capabilities (e.g., reasoning). While LLMs are mainly designed to process pure text, there are many real-world scenarios where text data are associated with rich structural information in the form of graphs (e.g., academic networks and e-commerce networks), or where graph data are paired with rich textual information (e.g., molecules with descriptions). Moreover, although LLMs have shown strong text-based reasoning ability, it remains underexplored whether this ability generalizes to graph scenarios (i.e., graph-based reasoning). In this paper, we provide a systematic review of scenarios and techniques related to large language models on graphs. We first summarize potential scenarios for adopting LLMs on graphs into three categories, namely pure graphs, text-rich graphs, and text-paired graphs. We then discuss detailed techniques for utilizing LLMs on graphs, including LLM as Predictor, LLM as Encoder, and LLM as Aligner, and compare the advantages and disadvantages of different schools of models. Furthermore, we discuss real-world applications of such methods and summarize open-source code and benchmark datasets. Finally, we conclude with potential future research directions in this fast-growing field. Related resources can be found at https://github.com/PeterGriffinJin/Awesome-Language-Model-on-Graphs.
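As a rough illustration of the LLM-as-Encoder role discussed above, the sketch below feeds per-node text embeddings into one mean-aggregation message-passing step; encode_text is a hypothetical stand-in for a real embedding model, and the toy graph is invented.

```python
# Sketch of the "LLM as Encoder" pattern: a language model turns each node's
# text into a feature vector, and a simple mean-aggregation step mixes
# neighbor features. encode_text is a stand-in for a real embedding API.
import numpy as np

def encode_text(text: str, dim: int = 8) -> np.ndarray:
    # Placeholder pseudo-embedding; swap in a real text-embedding model.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.normal(size=dim)

# Toy text-attributed graph (e.g., papers in a citation network).
texts = {0: "graph neural networks survey", 1: "transformers for text", 2: "LLMs on graphs"}
edges = [(0, 2), (1, 2)]  # treated as undirected in this sketch

h = {n: encode_text(t) for n, t in texts.items()}
neighbors = {n: [] for n in texts}
for u, v in edges:
    neighbors[u].append(v)
    neighbors[v].append(u)

# One message-passing step: new feature = mean of self + neighbor features.
h_new = {n: np.mean([h[n]] + [h[m] for m in neighbors[n]], axis=0) for n in texts}
print(h_new[2].shape)
```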
·arxiv.org·
Convert your text into an interactive Knowledge Graph
When reading lengthy or intricate texts, keeping an overview of the different dependencies within the context is crucial. Traditionally, humans achieve this through note-taking or by mentally building a concept map. Now imagine having an AI at hand that generates such a map for you. Even better, the…
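A minimal sketch of the underlying idea, assuming a hypothetical extract_triples step (in practice an LLM or information-extraction pipeline) and using networkx to hold the resulting graph; the example triples are invented.

```python
# Rough sketch of turning extracted (subject, relation, object) triples into a
# graph that an interactive viewer could render. extract_triples is a
# hypothetical stand-in; the triples below are hard-coded examples.
import networkx as nx

def extract_triples(text: str):
    # Hypothetical extractor; a real one would call an LLM or an IE pipeline.
    return [("Marie Curie", "born_in", "Warsaw"),
            ("Marie Curie", "won", "Nobel Prize in Physics")]

G = nx.DiGraph()
for s, p, o in extract_triples("…source text…"):
    G.add_edge(s, o, label=p)

print(G.number_of_nodes(), G.number_of_edges())
# G can then be exported (e.g., nx.node_link_data(G)) for an interactive front end.
```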
·ai-readiness.ch·
Scaling deep learning for materials discovery
Nature - A protocol using large-scale training of graph networks enables high-throughput discovery of novel stable structures and led to the identification of 2.2 million crystal structures, of...
Novel functional materials enable fundamental breakthroughs across technological applications from clean energy to information processing [1–11]. From microchips to batteries and photovoltaics, discovery of inorganic crystals has been bottlenecked by expensive trial-and-error approaches. Concurrently, deep-learning models for language, vision and biology have showcased emergent predictive capabilities with increasing data and computation [12–14]. Here we show that graph networks trained at scale can reach unprecedented levels of generalization, improving the efficiency of materials discovery by an order of magnitude.
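For readers unfamiliar with graph networks, the toy sketch below shows a single message-passing step over an atomic graph; it is a generic schematic, not the architecture, features, or training setup used in the paper.

```python
# Schematic of one message-passing step in a graph network over an atomic
# structure: edge messages from pairwise features, summed into node updates.
import numpy as np

rng = np.random.default_rng(1)
num_atoms, dim = 4, 8
x = rng.normal(size=(num_atoms, dim))          # per-atom features
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]       # toy bond/neighbor list
W_msg = rng.normal(size=(2 * dim, dim))        # message "MLP" (single layer here)
W_upd = rng.normal(size=(2 * dim, dim))        # update "MLP" (single layer here)

agg = np.zeros_like(x)
for i, j in edges:
    agg[i] += np.tanh(np.concatenate([x[i], x[j]]) @ W_msg)  # message j -> i
    agg[j] += np.tanh(np.concatenate([x[j], x[i]]) @ W_msg)  # message i -> j

x_new = np.tanh(np.concatenate([x, agg], axis=1) @ W_upd)    # node update
print(x_new.shape)
```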
·nature.com·
On the Multiple Roles of Ontologies in Explainable AI
This paper discusses the different roles that explicit knowledge, in particular ontologies, can play in Explainable AI and in the development of human-centric explainable systems and intelligible explanations. We consider three main perspectives in which ontologies can contribute significantly, namely reference modelling, common-sense reasoning, and knowledge refinement and complexity management. We review existing approaches in the literature and position them according to these three perspectives. The paper concludes by discussing the challenges that still need to be addressed to enable ontology-based approaches to explanation and to evaluate their human-understandability and effectiveness.
·arxiv.org·
Relational Deep Learning
A new research area that generalizes graph machine learning and broadens its applicability to a wide set of AI use cases.
·relbench.stanford.edu·
Power Tools for Powerful Knowledge Graphs | LinkedIn
Knowledge graphs and ontologies, like any other mission-critical resource, require a laser-sharp focus on architecture, accuracy, and coverage. Essentially all downstream processes rely on them, as does the overall success of any knowledge-centric AI product or company.
·linkedin.com·
Common sense knowledge graphs are slightly different from conventional knowledge graphs, but they share the most important thing: they both capture explicit symbolic knowledge
I really enjoyed the latest #UnconfuseMe with Bill Gates and Yejin Choi. Yejin's research is on symbolic knowledge distillation, which means they take large…
·linkedin.com·
A Survey of Graph Meets Large Language Model: Progress and Future Directions
Graphs play a significant role in representing and analyzing complex relationships in real-world applications such as citation networks, social networks, and biological data. Recently, Large Language Models (LLMs), which have achieved tremendous success in various domains, have also been leveraged in graph-related tasks to surpass traditional Graph Neural Network (GNN)-based methods and yield state-of-the-art performance. In this survey, we first present a comprehensive review and analysis of existing methods that integrate LLMs with graphs. We propose a new taxonomy that organizes existing methods into three categories based on the role played by the LLM in graph-related tasks (i.e., enhancer, predictor, and alignment component). We then systematically survey representative methods along the three categories of the taxonomy. Finally, we discuss the remaining limitations of existing studies and highlight promising avenues for future research. The relevant papers are summarized and will be continually updated at: https://github.com/yhLeeee/Awesome-LLMs-in-Graph-tasks.
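To make the predictor role of this taxonomy concrete, here is a hedged sketch in which a node's neighborhood is verbalized into a prompt and the label is read from the model's reply; call_llm and the example texts are hypothetical placeholders rather than an API from any surveyed work.

```python
# Sketch of the "LLM as predictor" role: a node's local neighborhood is
# verbalized into a prompt and the label is taken from the model's reply.
# call_llm is a hypothetical stand-in for any chat/completions API.
def call_llm(prompt: str) -> str:
    # Hypothetical; replace with a real LLM call.
    return "Machine Learning"

def classify_node(text: str, neighbor_texts: list[str]) -> str:
    context = "\n".join(f"- {t}" for t in neighbor_texts)
    prompt = (
        f"Paper: {text}\n"
        f"It cites:\n{context}\n"
        "Which research area does the paper belong to? Answer with one label."
    )
    return call_llm(prompt)

label = classify_node("Attention-based GNNs for citation graphs",
                      ["Graph attention networks", "Semi-supervised node classification"])
print(label)
```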
·arxiv.org·
Fast-track graph ML with GraphStorm: A new way to solve problems on enterprise-scale graphs | Amazon Web Services
We are excited to announce the open-source release of GraphStorm 0.1, a low-code enterprise graph machine learning (ML) framework to build, train, and deploy graph ML solutions on complex enterprise-scale graphs in days instead of months. With GraphStorm, you can build solutions that directly take into account the structure of relationships or interactions between billions […]
·aws.amazon.com·
VGAE-MCTS: A New Molecular Generative Model Combining the Variational Graph Auto-Encoder and Monte Carlo Tree Search
Molecular generation is crucial for advancing drug discovery, materials science, and chemical exploration. It expedites the search for new drug candidates, facilitates tailored material creation, and enhances our understanding of molecular diversity. By employing artificial intelligence techniques such as molecular generative models based on molecular graphs, researchers have tackled the challenge of identifying efficient molecules with desired properties. Here, we propose a new molecular generative model combining a graph-based deep neural network with a reinforcement learning technique. We evaluated the validity, novelty, and optimized physicochemical properties of the generated molecules. Importantly, the model explored uncharted regions of chemical space, allowing for the efficient discovery and design of new molecules. By leveraging these techniques to explore previously uncharted chemical space, the approach has considerable potential to accelerate drug discovery, materials science, and chemical research.
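As background for the VGAE half of the model, the numpy sketch below shows the standard variational graph auto-encoder recipe (GCN-style encoder, reparameterization trick, inner-product decoder); the weights are random stand-ins, and this is not the authors' implementation, which additionally couples the latent space with Monte Carlo tree search.

```python
# Minimal numpy sketch of a VGAE: encode nodes into a latent Gaussian, sample
# with the reparameterization trick, and score edges as sigmoid(z_i . z_j).
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)      # toy molecular graph (adjacency)
X = np.eye(4)                                   # one-hot atom features
A_hat = A + np.eye(4)
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt        # symmetric normalization

dim = 3
W_mu, W_logvar = rng.normal(size=(4, dim)), rng.normal(size=(4, dim))
mu = A_norm @ X @ W_mu                          # mean of q(z | X, A)
logvar = A_norm @ X @ W_logvar                  # log-variance of q(z | X, A)
z = mu + rng.normal(size=mu.shape) * np.exp(0.5 * logvar)  # reparameterization

A_rec = 1.0 / (1.0 + np.exp(-z @ z.T))          # reconstructed edge probabilities
print(A_rec.round(2))
```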
·pubs.acs.org·
Relevant Entity Selection: Knowledge Graph Bootstrapping via Zero-Shot Analogical Pruning
Knowledge Graph Construction (KGC) can be seen as an iterative process starting from a high-quality nucleus that is refined by knowledge extraction approaches in a virtuous loop. Such a nucleus can be obtained from knowledge existing in an open KG like Wikidata. However, due to the size of such generic KGs, integrating them as a whole may entail irrelevant content and scalability issues. We propose an analogy-based approach that starts from seed entities of interest in a generic KG and keeps or prunes their neighboring entities. We evaluate our approach on Wikidata through two manually labeled datasets that contain either domain-homogeneous or domain-heterogeneous seed entities. We empirically show that our analogy-based approach outperforms LSTM, Random Forest, SVM, and MLP baselines with drastically fewer parameters. We also evaluate its generalization potential in a transfer learning setting. These results advocate for the further integration of analogy-based inference in tasks related to the KG lifecycle.
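A toy illustration of analogy-style pruning, in which the keep/prune label for a new (seed, neighbor) pair is copied from the labeled pair with the most similar embedding offset; the embeddings and labels are invented, and the paper's actual analogical classifier is more involved.

```python
# Toy keep/prune decision by analogy: compare the (neighbor - seed) embedding
# offset of a new pair against labeled reference pairs and copy the closest label.
import numpy as np

rng = np.random.default_rng(0)
emb = {name: rng.normal(size=8) for name in
       ["Einstein", "physics", "violin", "Curie", "chemistry", "cooking"]}

# Labeled examples: (seed, neighbor, keep?). Invented for illustration.
labeled = [("Einstein", "physics", True), ("Einstein", "violin", False)]

def decide(seed: str, neighbor: str) -> bool:
    offset = emb[neighbor] - emb[seed]
    def similarity(example):
        s, n, _ = example
        ref = emb[n] - emb[s]
        return float(offset @ ref / (np.linalg.norm(offset) * np.linalg.norm(ref)))
    return max(labeled, key=similarity)[2]

print(decide("Curie", "chemistry"), decide("Curie", "cooking"))
```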
·arxiv.org·
VloGraph: A Virtual Knowledge Graph Framework for Distributed Security Log Analysis
The integration of heterogeneous and weakly linked log data poses a major challenge in many log-analytic applications. Knowledge graphs (KGs) can facilitate such integration by providing a versatile representation that can interlink objects of interest and enrich log events with background knowledge. Furthermore, graph-pattern-based query languages, such as SPARQL, can support rich log analyses by leveraging semantic relationships between objects in heterogeneous log streams. Constructing, materializing, and maintaining centralized log knowledge graphs, however, poses significant challenges. To tackle this issue, we propose VloGraph, a distributed and virtualized alternative to centralized log knowledge graph construction. The proposed approach does not involve any a priori parsing, aggregation, or processing of log data, but dynamically constructs a virtual log KG from heterogeneous raw log sources across multiple hosts. To explore the feasibility of this approach, we developed a prototype and demonstrate its applicability to three scenarios. Furthermore, we evaluate the approach in various experimental settings with multiple heterogeneous log sources and machines; the encouraging results from this evaluation suggest that the approach can enable efficient graph-based ad hoc log analyses in federated settings.
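To give a flavor of the graph-pattern queries the paper builds on, here is a small rdflib/SPARQL example over a hand-written log snippet; the ex: vocabulary and events are invented, and VloGraph itself answers such queries virtually over distributed raw logs rather than over a materialized store like this one.

```python
# Log events represented as RDF and queried with a SPARQL graph pattern.
from rdflib import Graph

ttl = """
@prefix ex: <http://example.org/log#> .
ex:event1 ex:host ex:web01 ; ex:process "sshd" ; ex:sourceIP "10.0.0.5" .
ex:event2 ex:host ex:db01  ; ex:process "sshd" ; ex:sourceIP "10.0.0.5" .
"""

g = Graph()
g.parse(data=ttl, format="turtle")

query = """
PREFIX ex: <http://example.org/log#>
SELECT ?host ?ip WHERE {
  ?event ex:process "sshd" ;
         ex:host ?host ;
         ex:sourceIP ?ip .
}
"""
for host, ip in g.query(query):
    print(host, ip)  # hosts reached via sshd, with the originating source IP
```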
·mdpi.com·