GraphNews

4336 bookmarks
Tony Seale Knowledge Graph Chatbot
Tony Seale Knowledge Graph Chatbot
I am thrilled to introduce a new AI Study Guide (https://lnkd.in/g4rPZVHW) dedicated to Tony Seale, another of my favorite authors, thought leaders, and…
Knowledge Graph
·linkedin.com·
Tony Seale Knowledge Graph Chatbot
PyGraft: Configurable Generation of Synthetic Schemas and Knowledge Graphs at Your Fingertips
PyGraft: Configurable Generation of Synthetic Schemas and Knowledge Graphs at Your Fingertips
Knowledge graphs (KGs) have emerged as a prominent data representation and management paradigm. Being usually underpinned by a schema (e.g., an ontology), KGs capture not only factual information but also contextual knowledge. In some tasks, a few KGs established themselves as standard benchmarks. However, recent works outline that relying on a limited collection of datasets is not sufficient to assess the generalization capability of an approach. In some data-sensitive fields such as education or medicine, access to public datasets is even more limited. To remedy the aforementioned issues, we release PyGraft, a Python-based tool that generates highly customized, domain-agnostic schemas and KGs. The synthesized schemas encompass various RDFS and OWL constructs, while the synthesized KGs emulate the characteristics and scale of real-world KGs. Logical consistency of the generated resources is ultimately ensured by running a description logic (DL) reasoner. By providing a way of generating both a schema and KG in a single pipeline, PyGraft's aim is to empower the generation of a more diverse array of KGs for benchmarking novel approaches in areas such as graph-based machine learning (ML), or more generally KG processing. In graph-based ML in particular, this should foster a more holistic evaluation of model performance and generalization capability, thereby going beyond the limited collection of available benchmarks. PyGraft is available at: https://github.com/nicolas-hbt/pygraft.
·arxiv.org·
PyGraft: Configurable Generation of Synthetic Schemas and Knowledge Graphs at Your Fingertips
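The abstract above describes PyGraft's two-stage pipeline (generate a schema, then a KG that conforms to it, with consistency checked by a DL reasoner) without showing its API. As a flavor of that idea, here is a minimal, self-contained sketch; the function names `generate_schema`/`generate_kg` are illustrative, not PyGraft's actual interface, and the domain/range check is a toy stand-in for a real reasoner:

```python
import random

def generate_schema(n_classes=3, n_relations=2, seed=0):
    """Toy schema generator: a set of classes plus relations with
    randomly assigned domain and range constraints."""
    rng = random.Random(seed)
    classes = [f"Class{i}" for i in range(n_classes)]
    relations = {
        f"rel{i}": {"domain": rng.choice(classes), "range": rng.choice(classes)}
        for i in range(n_relations)
    }
    return classes, relations

def generate_kg(classes, relations, n_entities=12, n_triples=8, seed=0):
    """Toy KG generator: each entity gets a class, and a triple is only
    emitted if it satisfies its relation's domain and range -- a
    stand-in for the consistency a DL reasoner would enforce."""
    rng = random.Random(seed)
    types = {f"e{i}": rng.choice(classes) for i in range(n_entities)}
    triples = set()
    for _ in range(1000):  # bounded attempts, in case few consistent triples exist
        if len(triples) >= n_triples:
            break
        rel = rng.choice(sorted(relations))
        dom, ran = relations[rel]["domain"], relations[rel]["range"]
        subjects = [e for e, t in types.items() if t == dom]
        objects = [e for e, t in types.items() if t == ran]
        if subjects and objects:
            triples.add((rng.choice(subjects), rel, rng.choice(objects)))
    return types, sorted(triples)

classes, relations = generate_schema()
types, triples = generate_kg(classes, relations)
```

The real tool drives both stages from a single config file and supports RDFS/OWL constructs; this sketch only captures the schema-then-KG shape of the pipeline.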
KGLM-Loop: A Bi-Directional Data Flywheel for Knowledge Graph Refinement and Hallucination Detection in Large Language Models
KGLM-Loop: A Bi-Directional Data Flywheel for Knowledge Graph Refinement and Hallucination Detection in Large Language Models
KGLM-Loop: A Bi-Directional Data Flywheel for Knowledge Graph Refinement and Hallucination Detection in Large Language Models ☀ 🌑 In the pursuit of…
KGLM-Loop: A Bi-Directional Data Flywheel for Knowledge Graph Refinement and Hallucination Detection in Large Language Models
·linkedin.com·
KGLM-Loop: A Bi-Directional Data Flywheel for Knowledge Graph Refinement and Hallucination Detection in Large Language Models
Decoding the Semantic Layer
Decoding the Semantic Layer
We've been hearing the term "Semantic layer" without truly understanding the semantics of it. So, here is episode 11 of #DnABytes and today's topic is:…
Decoding the Semantic Layer
·linkedin.com·
Decoding the Semantic Layer
Telecom GenAI based Network Operations: The Integration of LLMs, GraphRAG, Reinforcement Learning, and Scoring Models
Telecom GenAI based Network Operations: The Integration of LLMs, GraphRAG, Reinforcement Learning, and Scoring Models
Telecom GenAI based Network Operations: The Integration of LLMs, GraphRAG, Reinforcement Learning, and Scoring Models With the increasing complexity of…
Telecom GenAI based Network Operations: The Integration of LLMs, GraphRAG, Reinforcement Learning, and Scoring Models
·linkedin.com·
Telecom GenAI based Network Operations: The Integration of LLMs, GraphRAG, Reinforcement Learning, and Scoring Models
Why do LangChain and Autogen use graphs? Here are the top reasons
Why do LangChain and Autogen use graphs? Here are the top reasons
LLM frameworks like LangChain are moving towards a graph-based approach for handling their workflows. This represents the initial steps of a much larger…
Why do LangChain and Autogen use graphs? Here are the top reasons
·linkedin.com·
Why do LangChain and Autogen use graphs? Here are the top reasons
The latest in GNN technology - PyG 2.5
The latest in GNN technology - PyG 2.5
🚀 Join us for a special webinar on March 6th, 8am PT/5pm CET, as we unveil the latest in GNN technology - PyG 2.5! 🎉 Dive deep into the features with a live…
The latest in GNN technology - PyG 2.5
·linkedin.com·
The latest in GNN technology - PyG 2.5
Neural Scaling Laws on Graphs
Neural Scaling Laws on Graphs
Deep graph models (e.g., graph neural networks and graph transformers) have become important techniques for leveraging knowledge across various types of graphs. Yet, the scaling properties of deep graph models have not been systematically investigated, casting doubt on the feasibility of achieving large graph models through enlarging the model and dataset sizes. In this work, we delve into neural scaling laws on graphs from both model and data perspectives. We first verify the validity of such laws on graphs, establishing formulations to describe the scaling behaviors. For model scaling, we investigate the phenomenon of scaling law collapse and identify overfitting as the potential reason. Moreover, we reveal that the model depth of deep graph models can impact the model scaling behaviors, which differ from observations in other domains such as CV and NLP. For data scaling, we suggest that the number of graphs cannot effectively serve as a metric for graph data volume in the scaling law, since the sizes of different graphs are highly irregular. Instead, we reform the data scaling law with the number of edges as the metric to address the irregular graph sizes. We further demonstrate that the reformed law offers a unified view of the data scaling behaviors for various fundamental graph tasks including node classification, link prediction, and graph classification. This work provides valuable insights into neural scaling laws on graphs, which can serve as an essential step toward large graph models.
·arxiv.org·
Neural Scaling Laws on Graphs
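The edge-based data scaling law described in the abstract can be sketched in the conventional power-law form; this generic parameterization is an illustration of the idea, not necessarily the paper's exact formulation:

```latex
\mathcal{L}(E) \;\approx\; a\,E^{-b} + c
```

where \(\mathcal{L}\) is the test loss, \(E\) the total number of edges in the training data (replacing the number of graphs as the data-volume metric), and \(a\), \(b\), \(c\) task-dependent constants fit from data.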
Data provenance with PROV-O
Data provenance with PROV-O
Data provenance is something people love in theory, but never practice... I have just rewatched an excellent appearance by Jaron Lanier from a couple of…
Data provenance
·linkedin.com·
Data provenance with PROV-O
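PROV-O expresses provenance as RDF triples over a small core vocabulary (prov:Entity, prov:Activity, prov:wasGeneratedBy, prov:used, prov:wasDerivedFrom). A minimal sketch of such a record, serialized as N-Triples with only the standard library (rdflib would normally do this; the example dataset and activity IRIs are made up for illustration):

```python
# Standard namespaces (real); example.org IRIs are illustrative only.
PROV = "http://www.w3.org/ns/prov#"
RDF_TYPE = "http://www.w3.org/1999/02/22-rdf-syntax-ns#type"
EX = "http://example.org/"

# A cleaned dataset was generated by a cleaning run that used the raw data.
triples = [
    (EX + "cleaned-data", RDF_TYPE, PROV + "Entity"),
    (EX + "raw-data", RDF_TYPE, PROV + "Entity"),
    (EX + "cleaning-run", RDF_TYPE, PROV + "Activity"),
    (EX + "cleaned-data", PROV + "wasGeneratedBy", EX + "cleaning-run"),
    (EX + "cleaning-run", PROV + "used", EX + "raw-data"),
    (EX + "cleaned-data", PROV + "wasDerivedFrom", EX + "raw-data"),
]

# Serialize as N-Triples: one "<s> <p> <o> ." statement per line.
ntriples = "\n".join(f"<{s}> <{p}> <{o}> ." for s, p, o in triples)
```

Six statements are enough to answer "where did this dataset come from?", which is the practice the post argues people skip.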
Future Directions in Foundations of Graph Machine Learning
Future Directions in Foundations of Graph Machine Learning
Machine learning on graphs, especially using graph neural networks (GNNs), has seen a surge in interest due to the wide availability of graph data across a broad spectrum of disciplines, from life to social and engineering sciences. Despite their practical success, our theoretical understanding of the properties of GNNs remains highly incomplete. Recent theoretical advancements primarily focus on elucidating the coarse-grained expressive power of GNNs, predominantly employing combinatorial techniques. However, these studies do not perfectly align with practice, particularly in understanding the generalization behavior of GNNs when trained with stochastic first-order optimization techniques. In this position paper, we argue that the graph machine learning community needs to shift its attention to developing a more balanced theory of graph machine learning, focusing on a more thorough understanding of the interplay of expressive power, generalization, and optimization.
·arxiv.org·
Future Directions in Foundations of Graph Machine Learning
Series A Announcement | Orbital Materials
Series A Announcement | Orbital Materials
Orbital Materials (founded by ex-DeepMind researchers) raised a $16M Series A led by Radical Ventures and Toyota Ventures. OM focuses on materials science and sheds some light on LINUS, the in-house 3D foundation model for material design (apparently an ML potential and a generative model), with the ambition to become the AlphaFold of materials science. GNNs = 💸
·orbitalmaterials.com·
Series A Announcement | Orbital Materials
MLX-graphs — mlx-graphs 0.0.3 documentation
MLX-graphs — mlx-graphs 0.0.3 documentation
Apple presented MLX-graphs: a GNN library for the MLX framework, specifically optimized for Apple Silicon. Since CPU/GPU memory is shared on M1/M2/M3 chips, you don’t have to worry about moving tensors around, and at the same time you can enjoy the massive GPU memory of the latest M2/M3 machines (64 GB MBPs and Mac Minis are still much cheaper than an A100 80 GB). For starters, MLX-graphs includes GCN, GAT, GIN, GraphSAGE, and MPNN models, plus a few standard datasets.
·mlx-graphs.github.io·
MLX-graphs — mlx-graphs 0.0.3 documentation
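The models listed (GCN, GAT, GIN, GraphSAGE, MPNN) all share one core operation: each node aggregates messages from its neighbors. As a conceptual illustration only (plain Python, not MLX-graphs code), here is one GCN-style step with self-loops and mean aggregation:

```python
def gcn_aggregate(features, edges):
    """One GCN-style message-passing step with mean aggregation:
    each node's new feature is the average of its neighbors'
    features plus its own (via a self-loop)."""
    n = len(features)
    neighbors = {i: [i] for i in range(n)}  # self-loops
    for src, dst in edges:
        neighbors[dst].append(src)
    out = []
    for i in range(n):
        msgs = [features[j] for j in neighbors[i]]
        out.append([sum(col) / len(msgs) for col in zip(*msgs)])
    return out

feats = [[1.0, 0.0], [0.0, 1.0], [2.0, 2.0]]
edges = [(0, 1), (2, 1)]  # nodes 0 and 2 send messages to node 1
print(gcn_aggregate(feats, edges)[1])  # -> [1.0, 1.0]
```

The library's value proposition is running exactly this kind of kernel on Apple Silicon's unified memory, so feature tensors never need an explicit host-to-device copy.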
Artificial Intelligence for Complex Network: Potential, Methodology and Application
Artificial Intelligence for Complex Network: Potential, Methodology and Application
Complex networks pervade various real-world systems, from the natural environment to human societies. The essence of these networks is in their ability to transition and evolve from microscopic disorder-where network topology and node dynamics intertwine-to a macroscopic order characterized by certain collective behaviors. Over the past two decades, complex network science has significantly enhanced our understanding of the statistical mechanics, structures, and dynamics underlying real-world networks. Despite these advancements, there remain considerable challenges in exploring more realistic systems and enhancing practical applications. The emergence of artificial intelligence (AI) technologies, coupled with the abundance of diverse real-world network data, has heralded a new era in complex network science research. This survey aims to systematically address the potential advantages of AI in overcoming the lingering challenges of complex network research. It endeavors to summarize the pivotal research problems and provide an exhaustive review of the corresponding methodologies and applications. Through this comprehensive survey-the first of its kind on AI for complex networks-we expect to provide valuable insights that will drive further research and advancement in this interdisciplinary field.
·arxiv.org·
Artificial Intelligence for Complex Network: Potential, Methodology and Application
Data provenance with PROV-O
Data provenance with PROV-O
Data provenance is something people love in theory, but never practice... I have just rewatched an excellent appearance by Jaron Lanier from a couple of…
Data provenance
·linkedin.com·
Data provenance with PROV-O
Relational Harmony and a New Hope for Dimensionality
Relational Harmony and a New Hope for Dimensionality
Relational Harmony and a New Hope for Dimensionality ⛓️ In an era where data complexity escalates and the quest for meaningful technology integration…
Relational Harmony and a New Hope for Dimensionality
·linkedin.com·
Relational Harmony and a New Hope for Dimensionality
Knowledge Graphs: Today's triples just ain't enough | LinkedIn
Knowledge Graphs: Today's triples just ain't enough | LinkedIn
Knowledge hypergraphs are garnering a lot of attention – and deservedly so. You can find my two previous posts on knowledge hypergraphs and on more adaptive conceptualizations for hypergraphs as well as Kurt Cagle's focused and more practically minded pieces on Hypergraphs and RDF, on Named Graphs (
·linkedin.com·
Knowledge Graphs: Today's triples just ain't enough | LinkedIn
Knowledge Engineering using Large Language Models
Knowledge Engineering using Large Language Models
Knowledge engineering is a discipline that focuses on the creation and maintenance of processes that generate and apply knowledge. Traditionally, knowledge engineering approaches have focused on knowledge expressed in formal languages. The emergence of large language models and their capabilities to effectively work with natural language, in its broadest sense, raises questions about the foundations and practice of knowledge engineering. Here, we outline the potential role of LLMs in knowledge engineering, identifying two central directions: 1) creating hybrid neuro-symbolic knowledge systems; and 2) enabling knowledge engineering in natural language. Additionally, we formulate key open research questions to tackle these directions.
·arxiv.org·
Knowledge Engineering using Large Language Models
Global Knowledge Graph Market by Offering (Solutions, Services), By Data Source (Structured, Unstructured, Semi-structured), Industry (BFSI, IT & ITeS, Telecom, Healthcare), Model Type, Application, Type and Region - Forecast to 2028
Global Knowledge Graph Market by Offering (Solutions, Services), By Data Source (Structured, Unstructured, Semi-structured), Industry (BFSI, IT & ITeS, Telecom, Healthcare), Model Type, Application, Type and Region - Forecast to 2028
Rapid Growth in Data Volume and Complexity
·researchandmarkets.com·
Global Knowledge Graph Market by Offering (Solutions, Services), By Data Source (Structured, Unstructured, Semi-structured), Industry (BFSI, IT & ITeS, Telecom, Healthcare), Model Type, Application, Type and Region - Forecast to 2028
Chatbot created based on the prolific writings of Mike Dillinger. This chatbot helps you better digest his posts and articles on Knowledge Graphs, Taxonomy, Ontology and their critical roles in getting LLM technology more accurate and practical
Chatbot created based on the prolific writings of Mike Dillinger. This chatbot helps you better digest his posts and articles on Knowledge Graphs, Taxonomy, Ontology and their critical roles in getting LLM technology more accurate and practical
Check out this chatbot (https://lnkd.in/gv8Afk57) that I created entirely based on the prolific writings of Mike Dillinger, PhD. This chatbot helps you better…
created entirely based on the prolific writings of Mike Dillinger, PhD. This chatbot helps you better digest his posts and articles on Knowledge Graphs, Taxonomy, Ontology and their critical roles in getting LLM technology more accurate and practical
·linkedin.com·
Chatbot created based on the prolific writings of Mike Dillinger. This chatbot helps you better digest his posts and articles on Knowledge Graphs, Taxonomy, Ontology and their critical roles in getting LLM technology more accurate and practical