yfiles jupyter graphs for sparql: The open-source adapter for working with RDF databases
Hey Semantic Web/SPARQL/RDF/OWL/knowledge graph community:
Finally! We heard you! I just got this fresh from the dev kitchen:
Try our free SPARQL query result visualization widget for Jupyter Notebooks!
Based on our popular generic graph visualization widget for Jupyter, this widget makes it super convenient to add beautiful graph visualizations of your SPARQL queries to your Jupyter Notebooks.
Check out the example notebooks for Google Colab in the GitHub repo
https://lnkd.in/e8JP-eiM
This is a pre-1.0 release but already quite capable, as it builds on the well-tested generic widget. We are looking for your feedback on the feature set for the final release, so please do take a look and share your thoughts here or on GitHub!
What features are you missing? What do you like best about the widget? Let me know in the comments and I'll talk to the devs.
#sparql #rdf #owl #semanticweb #knowledgegraphs #visualization
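The widget's exact API may vary between releases, but the underlying idea, turning SPARQL results into nodes and edges a graph widget can render, can be sketched with nothing but the standard SPARQL JSON results format. This is an illustrative sketch, not the widget's actual API; all function names here are invented:

```python
import json

# A tiny SPARQL SELECT result (?s ?p ?o rows) in the W3C SPARQL 1.1
# JSON results format, as any SPARQL endpoint would return it.
results_json = """
{
  "head": {"vars": ["s", "p", "o"]},
  "results": {"bindings": [
    {"s": {"type": "uri", "value": "http://example.org/alice"},
     "p": {"type": "uri", "value": "http://xmlns.com/foaf/0.1/knows"},
     "o": {"type": "uri", "value": "http://example.org/bob"}},
    {"s": {"type": "uri", "value": "http://example.org/bob"},
     "p": {"type": "uri", "value": "http://xmlns.com/foaf/0.1/knows"},
     "o": {"type": "uri", "value": "http://example.org/carol"}}
  ]}
}
"""

def bindings_to_graph(doc):
    """Turn ?s ?p ?o bindings into node/edge lists for a graph widget."""
    nodes, edges = {}, []
    for row in doc["results"]["bindings"]:
        s, p, o = row["s"]["value"], row["p"]["value"], row["o"]["value"]
        for iri in (s, o):
            # One node per distinct IRI; label it with the local name.
            nodes.setdefault(iri, {"id": iri, "label": iri.rsplit("/", 1)[-1]})
        edges.append({"start": s, "end": o, "label": p.rsplit("/", 1)[-1]})
    return list(nodes.values()), edges

nodes, edges = bindings_to_graph(json.loads(results_json))
print(len(nodes), len(edges))  # 3 2
```

From there, any graph widget (yfiles-jupyter-graphs included) only needs those two lists to draw the picture.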
I have never been a fan of the "bubbles and arrows" kind of graph visualization. It is generally useless.
But when you can see the entire graph, and can tune the rendering, you start understanding the topology and structure - and ultimately you can tell a story with your graph (and that's what we all love, stories).
Gephi is a graph visualization tool for telling this sort of story with graphs, and it has been around for 15 (20?) years. Interestingly, while quite a number of Gephi plugins exist to load data (including from neo4j), no decent working plugin exists to load RDF data (yes, there was a "SemanticWebImport" plugin, but it looks outdated, its documentation is old, and it does not work with the latest version of Gephi, 0.10). That doesn't say anything good about the semantic knowledge graph community.
A few weeks ago I literally stumbled upon an old project we developed in 2017 to convert RDF graphs into the GEXF format that can be loaded into Gephi. Time for some serious cleaning, reengineering, and packaging! So here is v1.0.0 of the rebranded rdf2gephi utility tool!
The tool is a command line utility that can read an RDF knowledge graph (from files or a SPARQL endpoint), execute a set of SPARQL queries, and turn the results into a set of nodes and edges in a GEXF file. rdf2gephi provides default queries so a simple conversion runs without any parameters, but most of the time you will want to tune how your graph is turned into GEXF nodes and edges (for example, in my case, `org:Membership` entities relating `foaf:Person`s with `org:Organization`s are turned into edges rather than nodes, and some other entities are ignored altogether).
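The core of such a conversion can be approximated in a few lines of standard-library Python. This is a minimal sketch of the triples-to-GEXF idea, not rdf2gephi itself (which drives the mapping with configurable SPARQL queries); the prefixes and triples are made up:

```python
import xml.etree.ElementTree as ET

# Triples as they might come back from a SPARQL query over the KG.
triples = [
    ("ex:alice", "foaf:knows", "ex:bob"),
    ("ex:bob", "foaf:knows", "ex:carol"),
]

# Minimal GEXF 1.2 skeleton: <gexf><graph><nodes/><edges/></graph></gexf>
gexf = ET.Element("gexf", xmlns="http://www.gexf.net/1.2draft", version="1.2")
graph = ET.SubElement(gexf, "graph", defaultedgetype="directed")
nodes_el = ET.SubElement(graph, "nodes")
edges_el = ET.SubElement(graph, "edges")

ids = {}
for s, p, o in triples:
    for iri in (s, o):
        if iri not in ids:
            # One GEXF node per distinct resource, labeled by local name.
            ids[iri] = str(len(ids))
            ET.SubElement(nodes_el, "node", id=ids[iri],
                          label=iri.split(":")[-1])
for i, (s, p, o) in enumerate(triples):
    # One GEXF edge per triple, labeled with the predicate's local name.
    ET.SubElement(edges_el, "edge", id=str(i), source=ids[s],
                  target=ids[o], label=p.split(":")[-1])

xml_bytes = ET.tostring(gexf, encoding="unicode")
```

Writing `xml_bytes` to a `.gexf` file gives you something Gephi will open directly.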
And then what? Then you can load the GEXF file into Gephi and run a few operations to showcase your graph (see the little screencast video I recorded): run a layout algorithm, color nodes based on their rdf:type or another attribute you converted, size them according to their (in-)degree, detect clusters with a modularity algorithm, etc., and then export as SVG, PNG, or another format. Also, one of the cool features supported by the GEXF format is dynamic graphs, where each node and edge can be associated with a date range. You can then watch your graph evolve through time, like a movie!
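Dynamic graphs in GEXF work by attaching a time interval to each node and edge. A minimal hand-written example (labels and dates are invented for illustration):

```xml
<!-- Illustrative GEXF dynamic graph; names and dates are made up. -->
<gexf xmlns="http://www.gexf.net/1.2draft" version="1.2">
  <graph mode="dynamic" timeformat="date" defaultedgetype="directed">
    <nodes>
      <node id="0" label="Alice" start="2017-01-01" end="2020-06-30"/>
      <node id="1" label="ACME" start="2017-01-01"/>
    </nodes>
    <edges>
      <!-- The membership only "exists" during this interval,
           so Gephi's timeline shows the edge appearing and vanishing. -->
      <edge id="0" source="0" target="1"
            start="2018-03-01" end="2019-12-31"/>
    </edges>
  </graph>
</gexf>
```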
I hope I will be able to tell a more concrete Gephi-powered, RDF-backed graph-story in a future post !
All links in comments.
Apache Parquet is an important part of the mainstream data stack. It provides a space-efficient, widely-supported way to exchange tabular data that can be used directly by various query engines.
Knowledge Graph Research Report 2025: Global Market to Reach $6.93 Billion by 2030 from $1.06 Billion in 2024, Growing at a CAGR of 36.6% - Changing How Organizations Deal with Large Datasets
GFM-RAG: The First Graph Foundation Model for Retrieval-Augmented Generation
Introducing GFM-RAG: The First Graph Foundation Model for Retrieval-Augmented Generation!
We're excited to share our latest research: GFM-RAG: Graph…
Specifications to define data assets managed as products
In recent years, several specifications have emerged to define data assets managed as products. Today, two main types of specification exist:
1. Data Contract Specification (DCS): Focused on describing the data asset and its associated metadata.
2. Data Product Specification (DPS): Focused on describing the data product that manages and exposes the data asset.
The Open Data Contract Standard (ODCS) by Bitol is an example of the first specification type, while the Data Product Descriptor Specification (DPDS) by the Open Data Mesh Initiative represents the second.
But what are the key differences between these two approaches? Where do they overlap, and how can they complement each other? More broadly, are they friends, enemies, or frenemies?
I explored these questions in my latest blog post. The image below might give away some spoilers, but if you're curious about the full reasoning, read the post.
I'd love to hear your thoughts!
#TheDataJoy #DataContracts #DataProducts #DataGovernance
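To make the asset-vs-product distinction concrete, here is an illustrative contract fragment. This is not a verbatim ODCS or DPDS document, just a sketch of the flavor of each approach; all field names are invented for illustration:

```yaml
# Illustrative only - not a verbatim ODCS or DPDS document.
# A DCS-style spec describes the data asset itself:
kind: DataContract
id: customer-orders
version: 1.2.0
schema:
  - name: orders
    properties:
      - name: order_id
        type: string
        required: true
      - name: amount
        type: number
quality:
  - rule: order_id must be unique
# A DPS-style spec would instead describe the product wrapping the
# asset: its ports (APIs), SLOs, owning team, and lifecycle,
# not just the schema of the data it exposes.
```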
SymAgent: A Neural-Symbolic Self-Learning Agent Framework for Complex Reasoning over Knowledge Graphs
LLMs that automatically fill knowledge gaps - too good to be true?
Large Language Models (LLMs) often stumble in logical tasks due to hallucinations, especially when relying on incomplete Knowledge Graphs (KGs).
Current methods naively trust KGs as exhaustive truth sources - a flawed assumption in real-world domains like healthcare or finance where gaps persist.
SymAgent is a new framework that approaches this problem by making KGs active collaborators, not passive databases.
Its dual-module design combines symbolic logic with neural flexibility:
1. Agent-Planner extracts implicit rules from KGs (e.g., "If drug X interacts with Y, avoid co-prescription") to decompose complex questions into structured steps.
2. Agent-Executor dynamically pulls external data when KG triples are missing, bypassing the "static repository" limitation.
Perhaps most impressively, SymAgent's self-learning observes failed reasoning paths to iteratively refine its strategy and flag missing KG connections - achieving 20-30% accuracy gains over raw LLMs.
Equipped with SymAgent, even 7B models rival their much larger counterparts by leveraging this closed-loop system.
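The planner/executor split described above can be caricatured in a few lines. This is a toy sketch, not the authors' implementation; the KG contents, rule format, and all names are invented for illustration:

```python
# Toy planner/executor loop: the planner decomposes a question into KG
# lookups; the executor answers each step from the KG, falling back to
# an external source when a triple is missing. Purely illustrative.
kg = {("drugX", "interacts_with"): "drugY"}        # incomplete KG
external = {("drugY", "class"): "anticoagulant"}   # fallback source

def plan(question):
    # A real Agent-Planner would use an LLM plus rules mined from the
    # KG to decompose the question; here the plan is hard-coded.
    return [("drugX", "interacts_with"), ("drugY", "class")]

def execute(step):
    if step in kg:
        return kg[step], "kg"
    # Missing triple: pull external data and flag the gap, which is
    # where the "KG as active collaborator" framing comes in.
    return external[step], "external"

answers = [execute(s) for s in plan("Which drug class interacts with drugX?")]
# -> [("drugY", "kg"), ("anticoagulant", "external")]
```

The closed loop in the paper additionally feeds those flagged gaps back into the KG, which this sketch omits.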
It would be great if LLMs were able to autonomously curate knowledge and adapt to domain shifts without costly retraining.
But are we there yet? Are hybrid architectures like SymAgent the future?
Liked this post? Join my newsletter with 50k+ readers that breaks down all you need to know about the latest LLM research: llmwatch.com
What are the key ontology standards you should have in mind?
Ontology standards are crucial for knowledge representation and reasoning in AI and data…
Dynamic Reasoning Graphs + LLMs
Large Language Models (LLMs) often stumble on complex tasks when confined to linear reasoning.
What if they could…
And so we set out to understand _feedforward_ graphs (i.e. graphs w/o back edges).
Turns out these graphs are rather understudied for how often they are…
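In graph terms, "no back edges" means the graph admits a topological order, i.e. it is a DAG, which you can test with Kahn's algorithm. A minimal sketch:

```python
from collections import deque

def is_feedforward(n, edges):
    """True iff the directed graph on nodes 0..n-1 has no back edges,
    i.e. it admits a topological order (Kahn's algorithm)."""
    indeg = [0] * n
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        indeg[v] += 1
    # Repeatedly peel off nodes with no remaining incoming edges.
    queue = deque(i for i in range(n) if indeg[i] == 0)
    seen = 0
    while queue:
        u = queue.popleft()
        seen += 1
        for v in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    # Every node got ordered <=> no directed cycle exists.
    return seen == n

print(is_feedforward(3, [(0, 1), (1, 2)]))          # True: a chain
print(is_feedforward(3, [(0, 1), (1, 2), (2, 0)]))  # False: a cycle
```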
Pathway to Artificial General Intelligence (AGI). This is my view on the evolutionary steps toward AGI: 1. Large Language Models (LLMs): Language models…
We're very happy to announce that our latest release of Kùzu, version 0.8.0, is now available and ready to use! This release brings an exciting new feature that…
Nakala : from an RDF dataset to a query UI in minutes - SHACL automated generation and Sparnatural - Sparna Blog
Here is a use case of an automated version of Sparnatural, submitted as an example for Veronika Heimsbakk's upcoming book SHACL for the Practitioner, about the Shapes Constraint Language (SHACL). The Sparnatural knowledge graph explorer leverages SHACL specifications to drive a user interface (UI) that allows end users to easily discover the content of an RDF graph. What…
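For readers new to SHACL, the kind of shape a UI like Sparnatural can read looks like this. The classes and paths are illustrative, not taken from the Nakala dataset:

```turtle
# A minimal SHACL node shape; classes and paths are illustrative.
@prefix sh:   <http://www.w3.org/ns/shacl#> .
@prefix xsd:  <http://www.w3.org/2001/XMLSchema#> .
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
@prefix ex:   <http://example.org/> .

ex:PersonShape
    a sh:NodeShape ;
    sh:targetClass foaf:Person ;
    sh:property [
        sh:path foaf:name ;          # a searchable literal field
        sh:datatype xsd:string ;
    ] ;
    sh:property [
        sh:path ex:memberOf ;        # a navigable link to another class
        sh:class ex:Organization ;
    ] .
```

From a set of such shapes, a query builder knows which classes exist, which properties connect them, and which values are literals versus links.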
Tana snaps up $25M as its AI-powered knowledge graph for work racks up a 160K+ waitlist | TechCrunch
An app that helps people and teams in the working world simplify their to-do lists - ideally by organizing and doing some of the work for them - has
In my last post, AI Supported Taxonomy Term Generation, I used an LLM to help generate candidate terms for the revision of a topic taxonomy that had fallen out of sync with the content it was meant to tag. In that example, the taxonomy in question is for the "Insights" articles on my consulting webs
Ontology is not only about data! Many people think that ontologies are only about data (information). But an information model provides only one perspective…
GitHub - apache/incubator-hugegraph: A graph database that supports 100+ billion data records, with high performance and scalability (includes OLTP engine, REST API & backends)
Organisations have oceans of data, but most remains siloed, fragmented, and underutilized. Enterprise Knowledge Graphs are a practical and scalable solution…
How crazy is it that over 20 years ago, Berners-Lee, Hendler, and Lassila laid out a vision in 'The Semantic Web' that we're only now fully appreciating as the…
Bluesky starter pack on KGs and Semantic Web Technologies
On Bluesky, I created a starter pack on KGs and Semantic Web Technologies.