Found 2087 bookmarks
yfiles jupyter graphs for sparql: The open-source adapter for working with RDF databases
📣 Hey Semantic Web/SPARQL/RDF/OWL/Knowledge graph community: Finally! We heard you! I just got this fresh from the dev kitchen: 🎉 Try our free SPARQL query result visualization widget for Jupyter Notebooks! Based on our popular generic graph visualization widget for Jupyter, this widget makes it super convenient to add beautiful graph visualizations of your SPARQL query results to your Jupyter Notebooks. Check out the example notebooks for Google Colab in the GitHub repo https://lnkd.in/e8JP-eiM ✨ This is a pre-1.0 release but already quite capable, as it builds on the well-tested generic widget. We are looking for your feedback on the features for the final release, so please take a look and let me know your feedback here, or tell us on GitHub! What features are you missing? What do you like best about the widget? Let me know in the comments and I'll talk to the devs 😊 #sparql #rdf #owl #semanticweb #knowledgegraphs #visualization
GitHub - yWorks/yfiles-jupyter-graphs-for-sparql: The open-source adapter for working with RDF databases
·linkedin.com·
yfiles jupyter graphs for sparql: The open-source adapter for working with RDF databases
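A minimal sketch of the kind of SPARQL round-trip such a notebook would contain, using the standard SPARQLWrapper client against a public endpoint. The widget's own classes and methods are deliberately not reproduced here (see the example notebooks in the GitHub repo for its actual API); the endpoint and query below are only illustrative.

```python
# Sketch: fetch SPARQL results of the kind a Jupyter graph widget could render.
# SPARQLWrapper usage is standard; the yFiles widget's API is not guessed at here -
# consult the example notebooks in the GitHub repo for the real rendering call.
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("https://dbpedia.org/sparql")  # any public SPARQL endpoint
sparql.setReturnFormat(JSON)
sparql.setQuery("""
    PREFIX dbo: <http://dbpedia.org/ontology/>
    SELECT ?person ?influenced WHERE {
        ?person dbo:influenced ?influenced .
    } LIMIT 25
""")

# Each binding is a candidate edge (?person -> ?influenced) for a graph widget to draw.
for row in sparql.query().convert()["results"]["bindings"]:
    print(row["person"]["value"], "->", row["influenced"]["value"])
```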
RDF-to-Gephi
I have never been a fan of the "bubble and arrows" kind of graph visualizations. It is generally useless. But when you can see the entire graph, and can tune the rendering, you start understanding the topology and structure - and ultimately you can tell a story with your graph (and that's what we all love, stories). Gephi is a graph visualization tool for telling these sorts of stories with graphs, and it has been around for 15 (20?) years. Interestingly, while quite a number of Gephi plugins exist to load data (including from neo4j), no decent working plugin exists to load RDF data (yes, there was a "SemanticWebImport" plugin, but it looks outdated, with old documentation, and does not work with the latest - 0.10 - version of Gephi). This doesn't say anything good about the semantic knowledge graph community. A few weeks ago I literally stumbled upon an old project we developed in 2017 to convert RDF graphs into the GEXF format that can be loaded in Gephi. Time for a serious cleaning, reengineering, and packaging! So here is v1.0.0 of the rebranded rdf2gephi utility tool! The tool runs as a command line that can read an RDF knowledge graph (from files or a SPARQL endpoint), execute a set of SPARQL queries, and turn that into a set of nodes and edges in a GEXF file. rdf2gephi provides default queries to run a simple conversion without any parameters, but most of the time you will want to tune how your graph is turned into GEXF nodes and edges (for example, in my case, `org:Membership` entities relating `foaf:Persons` with `org:Organizations` are not turned into nodes, but into edges, and I want to ignore some other entities). And then what? Then you can load the GEXF file in Gephi and run a few operations to showcase your graph (see the little screencast video I recorded): run a layout algorithm, color nodes based on their rdf:type or another attribute you converted, change their size according to the (in-)degree, detect clusters based on a modularity algorithm, etc. - and then export as SVG, PNG, or another format. Also, one of the cool features supported by the GEXF format is dynamic graphs, where each node and edge can be associated with a date range. You can then see your graph evolving through time, like in a movie! I hope I will be able to tell a more concrete Gephi-powered, RDF-backed graph story in a future post! All links in comments.
·linkedin.com·
RDF-to-Gephi
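To make the "tune how your graph is turned into nodes and edges" step concrete, here is an illustrative edge-mapping query of the kind described above, run locally with rdflib. The variable names (?source, ?target, ?label), the input file name, and the output conventions are assumptions made for illustration, not rdf2gephi's documented query contract - check its README for the real format.

```python
# Illustrative only: turn org:Membership resources into direct person -> organization
# edges (instead of intermediate nodes), the kind of tuning described in the post.
# Variable and file names are assumptions, not rdf2gephi's documented contract.
from rdflib import Graph

g = Graph()
g.parse("memberships.ttl", format="turtle")  # hypothetical input data

EDGE_QUERY = """
PREFIX org: <http://www.w3.org/ns/org#>
SELECT ?source ?target ?label WHERE {
    ?m a org:Membership ;
       org:member ?source ;
       org:organization ?target .
    BIND("member of" AS ?label)
}
"""

# Each result row is one edge for the eventual GEXF output.
for source, target, label in g.query(EDGE_QUERY):
    print(source, "->", target, f"[{label}]")
```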
Specifications to define data assets managed as products
📚 In recent years, several specifications have emerged to define data assets managed as products. Today, two main types of specifications exist: 1️⃣ Data Contract Specification (DCS): Focused on describing the data asset and its associated metadata. 2️⃣ Data Product Specification (DPS): Focused on describing the data product that manages and exposes the data asset. 👉 The Open Data Contract Standard (ODCS) by Bitol is an example of the first specification type, while the Data Product Descriptor Specification (DPDS) by the Open Data Mesh Initiative represents the second. 🤔 But what are the key differences between these two approaches? Where do they overlap, and how can they complement each other? More broadly, are they friends, enemies, or frenemies? 🔎 I explored these questions in my latest blog post. The image below might give away some spoilers, but if you're curious about the full reasoning, read the post. ❤️ I'd love to hear your thoughts! #TheDataJoy #DataContracts #DataProducts #DataGovernance | 29 comments on LinkedIn
specifications have emerged to define data assets managed as products
·linkedin.com·
Specifications to define data assets managed as products
SymAgent: A Neural-Symbolic Self-Learning Agent Framework for Complex Reasoning over Knowledge Graphs
LLMs that automatically fill knowledge gaps - too good to be true? Large Language Models (LLMs) often stumble in logical tasks due to hallucinations, especially when relying on incomplete Knowledge Graphs (KGs). Current methods naively trust KGs as exhaustive truth sources - a flawed assumption in real-world domains like healthcare or finance where gaps persist. SymAgent is a new framework that approaches this problem by making KGs active collaborators, not passive databases. Its dual-module design combines symbolic logic with neural flexibility: 1. Agent-Planner extracts implicit rules from KGs (e.g., "If drug X interacts with Y, avoid co-prescription") to decompose complex questions into structured steps. 2. Agent-Executor dynamically pulls external data when KG triples are missing, bypassing the "static repository" limitation. Perhaps most impressively, SymAgent's self-learning observes failed reasoning paths to iteratively refine its strategy and flag missing KG connections - achieving 20-30% accuracy gains over raw LLMs. Equipped with SymAgent, even 7B models rival their much larger counterparts by leveraging this closed-loop system. It would be great if LLMs were able to autonomously curate knowledge and adapt to domain shifts without costly retraining. But are we there yet? Are hybrid architectures like SymAgent the future? ↓ Liked this post? Join my newsletter with 50k+ readers that breaks down all you need to know about the latest LLM research: llmwatch.com 💡
·linkedin.com·
SymAgent: A Neural-Symbolic Self-Learning Agent Framework for Complex Reasoning over Knowledge Graphs
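A conceptual Python sketch of the planner/executor control flow described above - not the authors' implementation. The planner, KG lookup, and external retriever are stubbed out as plain callables, so only the fallback-on-missing-triples logic is shown.

```python
# Conceptual sketch of a SymAgent-style loop (not the paper's code): a planner
# decomposes the question, the executor answers each step from the KG when possible
# and falls back to external retrieval when the KG has a gap, recording that gap.
from typing import Callable, List, Optional

def answer_with_kg(
    question: str,
    plan_steps: Callable[[str], List[str]],      # "Agent-Planner" stand-in
    kg_lookup: Callable[[str], Optional[str]],   # returns a KG-backed fact, or None
    external_search: Callable[[str], str],       # external retriever stand-in
) -> List[str]:
    observations: List[str] = []
    missing_steps: List[str] = []
    for step in plan_steps(question):
        fact = kg_lookup(step)
        if fact is None:
            # KG gap: retrieve externally instead of hallucinating, and record the
            # gap so the KG can later be completed (the "self-learning" signal).
            fact = external_search(step)
            missing_steps.append(step)
        observations.append(fact)
    return observations
```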
key ontology standards
What are the key ontology standards you should have in mind? Ontology standards are crucial for knowledge representation and reasoning in AI and data… | 32 comments on LinkedIn
key ontology standards
·linkedin.com·
key ontology standards
Pathway to Artificial General Intelligence (AGI)
🌟 Pathway to Artificial General Intelligence (AGI) 🌟 This is my view on the evolutionary steps toward AGI: 1️⃣ Large Language Models (LLMs): Language models…
Pathway to Artificial General Intelligence (AGI)
·linkedin.com·
Pathway to Artificial General Intelligence (AGI)
Nakala : from an RDF dataset to a query UI in minutes - SHACL automated generation and Sparnatural - Sparna Blog
Here is a use case for an automated version of Sparnatural, submitted as an example for Veronika Heimsbakk's upcoming book about the Shapes Constraint Language (SHACL), SHACL for the Practitioner. "The Sparnatural knowledge graph explorer leverages SHACL specifications to drive a user interface (UI) that allows end users to easily discover the content of an RDF graph. What…
·blog.sparna.fr·
Nakala : from an RDF dataset to a query UI in minutes - SHACL automated generation and Sparnatural - Sparna Blog
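For readers new to SHACL-driven UIs, here is a minimal, invented example of the kind of shapes graph such a tool can read - one NodeShape per searchable class, one property shape per offered criterion - loaded with rdflib. The shape and its properties are illustrative assumptions; Sparnatural's actual configuration conventions, and the automated SHACL generation discussed in the post, are described in its own documentation.

```python
# Minimal, invented SHACL shapes graph of the kind a SHACL-driven query UI can read:
# one NodeShape per searchable class, one property shape per offered criterion.
from rdflib import Graph

SHAPES_TTL = """
@prefix sh:   <http://www.w3.org/ns/shacl#> .
@prefix ex:   <http://example.org/shapes/> .
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
@prefix xsd:  <http://www.w3.org/2001/XMLSchema#> .

ex:PersonShape a sh:NodeShape ;
    sh:targetClass foaf:Person ;
    sh:property [ sh:path foaf:name  ; sh:datatype xsd:string ; sh:name "name" ] ;
    sh:property [ sh:path foaf:knows ; sh:class foaf:Person   ; sh:name "knows" ] .
"""

shapes = Graph().parse(data=SHAPES_TTL, format="turtle")
print(len(shapes), "triples in the shapes graph")  # quick sanity check
```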
Purpose-Driven Taxonomy Design | LinkedIn
In my last post, AI Supported Taxonomy Term Generation, I used an LLM to help generate candidate terms for the revision of a topic taxonomy that had fallen out of sync with the content it was meant to tag. In that example, the taxonomy in question is for the "Insights" articles on my consulting webs
Purpose-Driven Taxonomy Design
·linkedin.com·
Purpose-Driven Taxonomy Design | LinkedIn
Ontology is not only about data
Ontology is not only about data! Many people think that ontologies are only about data (information). But an information model provides only one perspective… | 85 comments on LinkedIn
Ontology is not only about data
·linkedin.com·
Ontology is not only about data
Enterprise Ontology: A Human-Centric Approach to Understanding the Essence of Organisation: Dietz, Jan L. G., Mulder, Hans B. F.: Amazon.nl: Books
Enterprise Ontology
·amazon.nl·
Enterprise Ontology: A Human-Centric Approach to Understanding the Essence of Organisation: Dietz, Jan L. G., Mulder, Hans B. F.: Amazon.nl: Books
The Semantic Web
How crazy is it that over 20 years ago, Berners-Lee, Hendler, and Lassila laid out a vision in 'The Semantic Web' that we're only now fully appreciating as the… | 24 comments on LinkedIn
·linkedin.com·
The Semantic Web