A Graph RAG (Retrieval-Augmented Generation) chat application that combines OpenAI GPT with knowledge graphs stored in GraphDB
After seeing yet another Graph RAG demo using Neo4j with no ontology, I decided to show what real semantic Graph RAG looks like.
The Problem with Most Graph RAG Demos:
Everyone's building Graph RAG with LPG databases (Neo4j, TigerGraph, ArangoDB, etc.) and calling it "knowledge graphs." But here's the thing:
Without formal ontologies, you don't have a knowledge graph; you just have a graph database.
The difference?
✗ LPG: Nodes and edges are just strings. No semantics. No reasoning. No standards.
✓ RDF/SPARQL: Formal ontologies (RDFS/OWL) that define domain knowledge. Machine-readable semantics. W3C standards. Built-in reasoning.
So I Built a Real Semantic Graph RAG
Using:
- Microsoft Agent Framework - AI orchestration
- Formal ontologies - RDFS/OWL knowledge representation
- Ontotext GraphDB - RDF triple store
- SPARQL - semantic querying
- GPT-5 - ontology-aware extraction
It's all on GitHub, a simple template as boilerplate for your project:
The "Jaguar problem":
What does "Yesterday I was hit by a Jaguar" really mean? It is impossible to know without concept awareness. To demonstrate why ontologies matter, I created a corpus with mixed content:
- Wildlife jaguars (Panthera onca)
- Jaguar cars (E-Type, XK-E)
- Fender Jaguar guitars
I fed this to GPT-5 along with a jaguar conservation ontology.
The result? The LLM automatically extracted ONLY wildlife-related entities, filtering out cars and guitars, because it understood the semantic domain from the ontology.
No post-processing. No manual cleanup. Just intelligent, concept-aware extraction.
This is impossible with LPG databases because they lack formal semantic structure. Labels like (:Jaguar) are just strings; the LLM has no way to know whether you mean the animal, the car, or the guitar.
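To make this concrete, here is a minimal pure-Python sketch of the idea (all class and instance names are illustrative, not taken from the actual repo; a real system would store these as RDF triples in GraphDB and filter with SPARQL): typed instances plus a class hierarchy let retrieval select by semantic domain, which bare string labels cannot do.

```python
# A toy triple store -- just enough to show why types beat bare labels.
triples = {
    # Ontology: class hierarchy (illustrative names only)
    ("PantheraOnca", "subClassOf", "Animal"),
    ("JaguarEType", "subClassOf", "Vehicle"),
    ("FenderJaguar", "subClassOf", "Instrument"),
    # Instance data: each "jaguar" is typed, not just labeled
    ("jaguar_1", "type", "PantheraOnca"),
    ("jaguar_2", "type", "JaguarEType"),
    ("jaguar_3", "type", "FenderJaguar"),
}

def subclasses_of(cls):
    """All classes reachable below `cls` via subClassOf (including itself)."""
    found, frontier = {cls}, {cls}
    while frontier:
        frontier = {s for s, p, o in triples
                    if p == "subClassOf" and o in frontier} - found
        found |= frontier
    return found

def instances_of(cls):
    """Instances whose type falls anywhere under `cls` -- the semantic
    filter an ontology-aware extractor relies on."""
    classes = subclasses_of(cls)
    return {s for s, p, o in triples if p == "type" and o in classes}

print(instances_of("Animal"))  # {'jaguar_1'} -- cars and guitars filtered out
```

In RDF terms, `type` is `rdf:type` and `subClassOf` is `rdfs:subClassOf`; an RDFS reasoner in a triple store gives you this closure computation for free.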
Knowledge Graphs = "Data for AI"
LLMs don't need more data; they need structured, semantic data they can reason over.
That's what formal ontologies provide:
✓ Domain context
✓ Class hierarchies
✓ Property definitions
✓ Relationship semantics
✓ Reasoning rules
This transforms Graph RAG from keyword matching into true semantic retrieval.
Check out the full implementation. The repo includes:
Complete Graph RAG implementation with Microsoft Agent Framework
Working jaguar conservation knowledge graph
Jupyter notebook: ontology-aware extraction from mixed-content text
https://lnkd.in/dmf5HDRm
And if you have gotten this far, you realize that most of this post was written by Cursor ... That goes for the code too.
Your Turn:
I know this is a contentious topic. Many teams are heavily invested in LPG-based Graph RAG. What are your thoughts on RDF vs. LPG for Graph RAG? Drop a comment below!
#GraphRAG #KnowledgeGraphs #SemanticWeb #RDF #SPARQL #AI #MachineLearning #LLM #Ontology #KnowledgeRepresentation #OpenSource #neo4j #graphdb #agentic-framework #ontotext #agenticai
ATOM: AdapTive and OptiMized dynamic temporal knowledge graph construction using LLMs
- Some state-of-the-art methods for knowledge graph (KG) construction that implement incrementality build a graph from around 3k atomic facts in 4-7 hours, while ATOM achieves the same in just 20 minutes using only 8 parallel threads and a batch size of 40 for asynchronous LLM API calls.
- What's the secret behind this performance?
The architecture. The parallel design.
- Incrementality in KG construction was key, but it significantly limits scalability: the method must first build the KG and compare it with the previous one before moving on to the next chunk. That's why we eliminated this in iText2KG.
- Why is scalability so important? The short answer: real-time analytics.
Fast dynamic TKG construction enables LLMs to reason over these graphs and generate responses instantly, in real time.
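The paper's code isn't shown here, but the claimed setup (asynchronous LLM API calls, 8 parallel workers, batch size 40, no build-then-compare step between chunks) can be sketched with asyncio; `extract_facts` and every other name below are hypothetical stand-ins, not ATOM's actual API:

```python
import asyncio

async def extract_facts(chunk: str) -> list[str]:
    # Hypothetical stand-in for an asynchronous LLM API call
    await asyncio.sleep(0)
    return [f"fact({chunk})"]

async def build_kg(chunks: list[str], batch_size: int = 40,
                   workers: int = 8) -> list[str]:
    # Cap the number of batches in flight, mirroring "8 parallel threads"
    sem = asyncio.Semaphore(workers)

    async def process(batch: list[str]) -> list[str]:
        async with sem:
            per_chunk = await asyncio.gather(*(extract_facts(c) for c in batch))
        return [fact for facts in per_chunk for fact in facts]

    # Batches run independently -- no sequential build-and-compare step,
    # which is the incrementality bottleneck described above
    batches = [chunks[i:i + batch_size] for i in range(0, len(chunks), batch_size)]
    results = await asyncio.gather(*(process(b) for b in batches))
    return [fact for facts in results for fact in facts]

facts = asyncio.run(build_kg([f"chunk-{i}" for i in range(100)]))
print(len(facts))  # 100
```

Because `asyncio.gather` preserves argument order, the extracted facts come back in document order even though the batches execute concurrently.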
Discover more secrets behind this parallel architecture by reading the full paper (link in the first comment).
Beyond RDF vs LPG: Operational Ontologies, Hybrid Semantics, and Why We Still Chose a Property Graph | LinkedIn
How to stay sane about "semantic Graph RAG" when your job is shipping reliable systems, not winning ontology theology wars. You don't wake up in the morning thinking about OWL profiles or SPARQL entailment regimes.
Knowledge Graphs and GraphRAG have sorta taken over my life the last two months or so, so I thought I would share some very important books for learners and builders
Knowledge Graphs: I'm going to enjoy this KG book a lot more now. It's simple reading, in my opinion.
Text as Data: if you work in Data Science and AI, just buy this book right now and then read it. You need to know this. This is my favorite NLP book.
Orange Book (sorry, long title): the best builder book I have found so far. It shows how to build with GraphRAG, and you should check it out. I really enjoyed reading this book and use it all the time.
Just wanted to make some recommendations as I am looking at a lot of my books for ideas, lately. These are diamonds. Find them where you like to shop for books!
#100daysofnetworks
Connected Data London 2024: Semantics, a Disco Ball Jacket and an Escalator Metaphor in Hindsight | Teodora Petkova
This text is about my impressions from Connected Data London 2024, and about working towards a shared space of present and possible collaborative actions based on connected data and content. Intro: Shiny Happy Data People, 20 years after the article in which Sir Tim Berners-Lee imagined a paper on which you can click with...
The idea that chips and ontology is what you want to short is batsh*t crazy
"The idea that chips and ontology is what you want to short is batsh*t crazy."
Whilst I couldn't agree more about how good chips (both the silicon & potato varieties) & ontologies are, the context & semantics, as always, are important.....
The Context:
Michael Burry of Big Short fame is shorting AI as a trend, with puts on Nvidia & Palantir disclosed in the latest regulatory filings for his fund Scion Asset Management: $187 million against Nvidia and $912 million against Palantir as of Sept. 30.
The circularity of the latest AI boom and the limitations of Large Language Models are amongst the many reasons cited for the apparent AI bubble, which Burry believes will burst.
Whether you consider it a formal Ontology or not, Palantir & Alex Karp are some of the few to use the 'O word' openly in product marketing - something long considered a brave move by many a frontier technology company!
https://lnkd.in/eh7SAS8P
Ontologies are also a key component of research & development to overcome many of the limitations of contemporary 'AI' systems and the factors contributing to the AI bubble Burry references.
Plug:
Interested in learning about what industry leaders are doing to overcome these limitations and develop AI systems with true reasoning capabilities? Come to this year's Connected Data London conference and engage in the debate, discussions and learning.
This year it's at the Leonardo Royal Hotel Tower Bridge on 20th & 21st November, and tickets are selling fast!
https://lnkd.in/entfkddD
CNBC article below with video interview:
https://lnkd.in/eHDpnWAW
Pseudo-Knowledge Graphs for Better RAG | by Devashish Datt Mamgain | Oct, 2025 | Towards AI
Pseudo-Knowledge Graphs for Better RAG. Retrieval-Augmented Generation (RAG) was supposed to give Large Language Models perfect memory: ask a question, fetch the exact facts, and generate a fluent and ...
Using Knowledge Graphs to Accelerate and Standardize AI-Generated Technical Documentation | by Michael Iantosca | Oct, 2025 | Medium
Using Knowledge Graphs to Accelerate and Standardize AI-Generated Technical Documentation for Avalara Connector Guides. A Practical Implementation Guide for Structured, Scalable Documentation ...
Unifying Data Structures and Knowledge Graphs | by Mark Burgess | Oct, 2025 | Medium
Unifying Data Structures and Knowledge Graphs. Why we get confused about the difference between data and knowledge. This article is about a technical issue around the use of Knowledge Graphs to ...
Using Knowledge Graphs For Inferential Reasoning | by Mark Burgess | Nov, 2025 | Medium
Using Knowledge Graphs For Inferential Reasoning. Solving semantic path integrals for causal outcomes. This is a short technical note about using the path-solving capabilities of knowledge graphs ...
Industry Knowledge Graph Case Study - Semantic Arts
gistBFO is an open-source alignment mapping Semantic Arts' gist to BFO via subclass links, delivering BFO compliance and cross-domain interoperability.
"Shorting Ontology": Why Michael Burry Might Not Be Wrong | LinkedIn
"The idea that chips and ontology is what you want to short is batsh*t crazy." - Alex Karp, CNBC, November 2025. When Palantir's CEO, Alex Karp, lashed out at Michael Burry - the "Big Short" investor who bet against Palantir and Nvidia - he wasn't just defending his balance sheet.
RTVE-Graph - The knowledge graph of the Spanish audiovisual archive
FIAT/IFTA Archive Achievement Awards 2025 - Shortlisted Nominee. Category: Excellence in Media Management. Title: "RTVE-Graph - The knowledge graph of the Spanis...
How topology and ontology can build the future network without apology - The Mobile Network
Full network automation is still a way off - but telco execs say it's a must-have, and they know what they need to get there. Can they build a re-imagined OSS to unlock innovation?
The Schema Paradox: Why LPGs Are Both Structured and Free
In the world of data and AI, we are often forced to choose between rigid structure and complete flexibility. But labelled property graphs (LPGs) quietly break that rule. They evolve structure through use, building ontology through action.
In this new piece, I explore how LPGs balance order and chaos to form living schemas that grow alongside the data and its context. Integrated with GraphRAG and Applied Knowledge Graphs (AKGs), they become engines of adaptive intelligence, not just models of data.
This isn't theory; it's how modern systems are learning to reason contextually, adapt dynamically and evolve continuously.
Full article: https://lnkd.in/eUdmQjyH
#GraphData #KnowledgeGraph #KG #GraphRAG #AppliedKnowledgeGraph #AKG #LPG #DataArchitecture #AI #KnowledgeEngineering
One question keeps coming up about UDA: why don't we call them ontologies?
We actually tried that. People said 'ontology' was too abstract, too academic, that it made them feel dumb. So we were asked to step back: what were we really asking for? Conceptual models of business domains. It turns out people already had the right intuitions: domain-driven design, domain graph services, database modeling, etc.
We literally did a search-and-replace: 'ontology' became 'domain model'. They understood overnight.
But there's more to it. Most ontology frameworks are just RDF, OWL, and SHACL. Upper does use those as building blocks and adds what's missing: information architecture, federation for collaborative modeling, and bootstrap properties. Domain models that are self-describing, self-referencing, self-governing. 'Ontology' just doesn't capture that precision.
So 'domain model' it is, not 'ontology'.
Turn Text Into a Knowledge Graph with 70B LLM on DGX Spark
Looking to run local GraphRAG or other graph analytics use cases? With DGX Spark, you can prepare your local text files for graph use cases at your desk. In t...
Text2KGBench-LettrIA: A Refined Benchmark for Text2Graph Systems
LLMs can be powerful tools to extract information from texts and automatically populate Knowledge Graphs guided by ontologies given as inputs. BUT how good are they? To answer this question, we need benchmarks!
With Lettria, we built the Text2KGBench-LettrIA benchmark covering 19 different ontologies in various domains (company, film, food, politician, sports, monument, etc.) and consisting of nearly 5k sentences strictly annotated with triples conforming to these ontologies (208 classes, 426 properties), yielding more than 17k triples.
What's more? We threw a lot of compute at comparing the performance and efficiency of numerous closed LLMs and variants (GPT-4, Claude 3, Gemini) and numerous fine-tuned open-weights models (Mistral 3, Qwen 3, Gemma 3, Phi 4).
Key takeaway: when provided with high-quality data, fine-tuned open models largely outperform their larger, proprietary counterparts!
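For intuition, triple-level scoring of an extraction against gold annotations is often computed as exact-match precision/recall/F1 over (subject, predicate, object) sets. This sketch is illustrative only and may differ from the paper's actual protocol (e.g. label normalization or partial matching):

```python
def triple_f1(predicted: set, gold: set) -> tuple:
    """Exact-match scoring over sets of (subject, predicate, object) triples."""
    tp = len(predicted & gold)  # triples found in both sets
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Hypothetical example triples, not taken from the benchmark
gold = {("Nolan", "directed", "Inception"), ("Inception", "genre", "SciFi")}
pred = {("Nolan", "directed", "Inception"), ("Nolan", "born_in", "London")}
print(triple_f1(pred, gold))  # (0.5, 0.5, 0.5)
```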
Curious about the detailed results?
Read our paper at https://lnkd.in/e-EZCjWm
See our presentation at https://lnkd.in/eEdCCpdA, which I have just given at the Knowledge Base Construction from Pre-Trained Language Models Workshop, colocated with ISWC - the International Semantic Web Conference.
Want to use these results in your operations? Sign up to use the newly released PERSEUS model: https://lnkd.in/e7exyJHc
Joint work with Julien PLU, Oscar Moreno Escobar, Edouard Trouillez, Axelle Gapin, Pasquale Lisena, Thibault Ehrhart
#iswc2025 #LLMs #KnowledgeGraphs #NLP #Research
EURECOM, Charles Borderie
Clinical Knowledge Graph (CKG) is a platform with a twofold objective: 1) build a graph database with experimental data and data imported from diverse biomedical databases; 2) automate knowledge discovery...
The audiobook version of "Knowledge Graphs and LLMs in Action" is now available
Exciting news! The audiobook version of "Knowledge Graphs and LLMs in Action" is now available!
Are you busy but would love to learn how to build powerful and explainable AI solutions? No problem! Manning has just released the audio version of our book.
Now you can listen while you're:
- Running and training for your next marathon
- Commuting to the office
- Sitting in the parking lot waiting for your kids to finish their violin lesson
Your schedule is packed, but that shouldn't stop you from mastering these powerful AI techniques.
Get your copy here: https://hubs.la/Q03MVhhk0
And don't forget to use discount code: lagraphs40 for 40% off!
Clever solutions for smart people.
Ontology Bill of Material? Do we really need it?
In software engineering, we have SBOMs, Maven, Gradle, pip, and npm. We have decades of best practices for dependency management, version pinning, and granular control. We can exclude transitive dependencies we don't want.
In ontology engineering and semantic modeling... we have owl:imports.
We're trying to build mission-critical, enterprise-scale knowledge graphs, but our core dependency mechanism often feels like a step back in time. We talk about logical rigor, but we're living in "dependency hell."
So:
"How do you manage different versions of an ontology? How do you go through the complexity of imports? How do you propagate changes?"
And the answer right now is: with great difficulty, and a lot of custom workarounds.
The owl:imports axiom is a logical "all-or-nothing" merge. It's defined as a transitive closure. This is the direct cause of our most common and painful problems:
- The "Diamond Problem": Your ontology imports Model-A (which needs Common-v1) and Model-B (which needs Common-v2). Your tool just pulls in both, creating a logical mess of conflicting axioms. A software build would fail and force you to resolve this.
- Model Bloat: You want to use one class from a massive upper ontology (e.g. schema.org)? Congratulations, you just imported the entire thing, plus everything it imports. And good luck with the resulting RAM spikes, lags, ...
- No Granular Control: This is the big one. In Maven or Gradle, you can exclude a problematic transitive dependency. In OWL, this is simply not possible at the specification level. You get everything.
So, yes, we need the concept of an "Ontology Bill of Materials" (OBOM).
We need a manifest file that lives with our ontology (and helps us to build it) and provides a reproducible "build." We need our tools (Protege, OWL API, ...) to treat this as a first-class citizen.
This manifest would:
- List all direct dependencies.
- Pin their exact versions (via owl:versionIRI or even a content hash).
- Resolve and list the full transitive dependency graph, so we know exactly what we are loading.
- Detect problematic imports, cyclic dependencies, ...
The "duct tape" we use today (custom build scripts, manual copy-pasting of elements, and so on) is just an admission that owl:imports is not enough.
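As a sketch of what such a manifest-driven tool could do, here is a minimal resolver over declared imports. A plain dict stands in for parsed owl:imports statements, and all ontology names are hypothetical; it lists the transitive closure and flags cycles, two of the checks an OBOM would make routine:

```python
def resolve_imports(manifest: dict, root: str):
    """Walk a declared import graph, returning the transitive closure in
    dependency order and any cycles found. `manifest` maps ontology IRI ->
    list of directly imported IRIs (assumption: versions are pinned
    elsewhere, e.g. via owl:versionIRI or a content hash)."""
    closure, seen, cycles = [], set(), []

    def visit(iri, path):
        if iri in path:  # back-edge: a cyclic import chain
            cycles.append(path[path.index(iri):] + [iri])
            return
        if iri in seen:
            return
        seen.add(iri)
        for dep in manifest.get(iri, []):
            visit(dep, path + [iri])
        closure.append(iri)  # post-order: dependencies come first

    visit(root, [])
    return closure, cycles

# The "Diamond Problem": Root imports A and B, which pin different Common versions
manifest = {
    "Root": ["A", "B"],
    "A": ["Common-v1"],
    "B": ["Common-v2"],
}
closure, cycles = resolve_imports(manifest, "Root")
print(closure)  # ['Common-v1', 'A', 'Common-v2', 'B', 'Root']
print(cycles)   # []
```

A real tool would also diff `Common-v1` against `Common-v2` and fail the "build" on conflicting axioms, the way Maven fails on irreconcilable versions.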
It's time to adopt the mature engineering practices that software teams have relied on for decades.
So how do you deal with complex ontology/model dependencies? How do you and your teams manage this chaos today?
#Ontology #KnowledgeGraph #SemanticWeb #RDF #OWL
The O-word, "ontology", is here! Traditionally, you couldn't say the word "ontology" in tech circles without getting a side-eye. Now? Everyone's suddenly an ontology expert. And honestly... I'm here for it.
As someone who's been deep in this space, this moment is exciting. We're finally having the right conversations about semantic interoperability and its relationship with Agentic AI.
But here's the thing: before we reinvent the wheel, we need to understand the road already paved.
Homework if you're diving into this space (link in comments):
1. Read the original Semantic Web vision article by Tim Berners-Lee, James Hendler & Ora Lassila. It laid out a future we're finally ready for. Before you complain that "it's complicated" or "that never worked and failed", recall that this was a vision that laid out a roadmap of what was needed. Learn about the W3C standards that have emerged from this vision. Honored that I got to write a book with Ora!
2. Explore ISWC (International Semantic Web Conference). This scientific community was created to research what it would take to fulfill the Semantic Web vision. It's the top scientific conference in this space, running for over 20 years. I'm proud to call it my academic home (been attending since 2008). ISWC will take place next week in Nara, Japan, and I'm excited to be keynoting the Knowledge Base Construction from Pre-Trained Language Models Workshop and to be part of the panel "Reimagining Knowledge: The Future and Relevance of Symbolic Representation in the Age of LLMs". Take a look at the program and accepted papers if you want to know where the puck is heading!
3. Learn the history of knowledge graphs. It didn't start with Google. It's not just about graph databases. The Semantic Web has been a huge influence, alongside so many events over 50+ years that have worked to connect data and knowledge at scale. Prof. Claudio Gutierrez and I wrote a paper that goes into this history.
Why does this matter? Because we're in a moment where many talk about "semantics" and "knowledge", but often without acknowledging the deep foundations. AI agents, interoperability, and scalable intelligence depend on these foundations. The tech, standards and tools exist. If you rebuild from scratch, you waste time. But if you stand on these shoulders, you build faster and smarter.
Learn about the W3C standards: RDF, OWL, SPARQL, SHACL, SKOS, etc. Take a look at open-source projects like Apache Jena, RDFLib, QLever, Protégé. If something's broken, or if you don't like how it's done, don't start from scratch. Improve it. Contribute. Build on what's already there.
So if you're posting about ontologies or knowledge graphs, please ask yourself:
- Did I look at the classical Semantic Web work (yes, that 2001 article) and the history of knowledge graphs?
- Am I building on the shoulders of giants, rather than re-starting?
- If I disagree with a standard or open-source project, am I choosing to contribute instead of ignoring it?
UB to offer a fully online graduate degree in ontology
The applied ontology degree will prepare students from around the world for work in this rapidly growing interdisciplinary branch of information science.