Language, Graphs, and AI in Industry
Over the past five years, news about AI has been filled with remarkable research, at first focused on graph neural networks (GNNs) and more recently on large language models (LLMs). Business, however, tends to run on connected data (networks, graphs), whether you're untangling supply networks in Manufacturing, working on drug discovery in Pharma, or mitigating fraud in Finance. Starting from supplier agreements, bills of materials, internal process docs, sales contracts, and so on, there's a graph inside nearly every business process, and that graph is defined by language. This talk addresses how to leverage natural language and graph technologies together for AI applications in industry. We'll look at how LLMs get used to build and augment graphs, and conversely how graph data gets used to ground LLMs for generative AI use cases in industry, where a kind of "virtuous cycle" of feedback loops based on graph data is emerging. Our team has been engaged, on the one hand, with enterprise use cases in manufacturing; on the other hand, we've worked as intermediaries between research teams funded by enterprise and the open source projects that enterprise needs, particularly in the open source ecosystem for AI models. There are caveats, though: this work is not simple, and translating the latest research into production-ready code is especially complex and expensive. Let's examine the caveats other teams should understand, and look toward practical examples.
·derwen.ai·
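As a rough illustration of the "virtuous cycle" this abstract describes, the sketch below uses an LLM to extract triples from documents into a graph, then feeds graph facts back into a prompt to ground generation. The call_llm helper, the prompt wording, and the networkx representation are assumptions made for illustration, not anything prescribed by the talk.

```python
# Sketch of the two directions described above: LLM -> graph (extraction)
# and graph -> LLM (grounding). call_llm is a hypothetical stand-in for
# whatever model endpoint you actually use.
import json
import networkx as nx

def call_llm(prompt: str) -> str:
    """Hypothetical LLM call; replace with your model client of choice."""
    raise NotImplementedError

def extract_triples(document: str) -> list[tuple[str, str, str]]:
    """Ask the LLM to emit (subject, relation, object) triples as JSON."""
    prompt = (
        "Extract (subject, relation, object) triples from the text below "
        "and return them as a JSON list of 3-element lists.\n\n" + document
    )
    return [tuple(t) for t in json.loads(call_llm(prompt))]

def build_graph(documents: list[str]) -> nx.MultiDiGraph:
    """LLM -> graph direction: accumulate extracted triples into a graph."""
    graph = nx.MultiDiGraph()
    for doc in documents:
        for subj, rel, obj in extract_triples(doc):
            graph.add_edge(subj, obj, relation=rel)
    return graph

def grounded_answer(graph: nx.MultiDiGraph, entity: str, question: str) -> str:
    """Graph -> LLM direction: ground the prompt with facts about one entity."""
    facts = [
        f"{u} --{d['relation']}--> {v}"
        for u, v, d in graph.out_edges(entity, data=True)
    ] if entity in graph else []
    prompt = "Known facts:\n" + "\n".join(facts) + f"\n\nQuestion: {question}"
    return call_llm(prompt)
```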
LLMs have revolutionized AI. Do we still need knowledge models and taxonomies, and why? | LinkedIn
Although I have of course heard this question more often in recent months than in all the years before, it is really just a reiteration of the question of all questions, probably the most fundamental one for AI: how much human (or symbolic AI) does statistical AI need? …
·linkedin.com·
Ontologies and Knowledge Graphs offer a way to connect embedding vectors to structured knowledge
Ontologies and Knowledge Graphs offer a way to connect embedding vectors to structured knowledge, enhancing their meaning and explainability. Let's delve into…
·linkedin.com·
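A minimal sketch of one way to make the connection the post describes, assuming a generic text-embedding model: each KG entity carries a description that gets embedded, a free-text query is matched by cosine similarity, and the matched entity's typed relations serve as the structured, explainable side of the answer. The embed helper and the toy entries are hypothetical.

```python
# Link embedding vectors to KG entities: vector match for recall,
# graph relations for explanation.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Hypothetical text-embedding call; replace with your embedding model."""
    raise NotImplementedError

# Toy KG: entity -> (description used for embedding, typed relations)
kg = {
    "Aspirin": ("acetylsalicylic acid, an NSAID", [("treats", "Pain"), ("isA", "Drug")]),
    "Ibuprofen": ("a propionic-acid NSAID", [("treats", "Inflammation"), ("isA", "Drug")]),
}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def ground(query: str) -> dict:
    """Map a query onto the KG entity whose description is closest in
    embedding space; return its relations as the structured explanation."""
    q_vec = embed(query)
    # Re-embeds descriptions on each call for brevity; cache in practice.
    scored = {name: cosine(q_vec, embed(desc)) for name, (desc, _) in kg.items()}
    best = max(scored, key=scored.get)
    return {"entity": best, "score": scored[best], "relations": kg[best][1]}
```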
Ontologies are the backbone of the Semantic Web bridging the gap between human and machine understanding
Ontologies are the backbone of the Semantic Web, bridging the gap between human and machine understanding. They define the concepts and relationships that…
·linkedin.com·
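To ground the phrase "concepts and relationships" in something concrete, here is a minimal rdflib sketch that declares two classes and an object property between them. The EX namespace and the supplier/component names are illustrative assumptions, not part of the post.

```python
# Declare concepts (classes) and a relationship (object property) with rdflib.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import OWL, RDF, RDFS

EX = Namespace("http://example.org/ontology#")
g = Graph()
g.bind("ex", EX)

# Concepts
g.add((EX.Supplier, RDF.type, OWL.Class))
g.add((EX.Component, RDF.type, OWL.Class))

# Relationship with domain and range, readable by humans and machines alike
g.add((EX.supplies, RDF.type, OWL.ObjectProperty))
g.add((EX.supplies, RDFS.domain, EX.Supplier))
g.add((EX.supplies, RDFS.range, EX.Component))
g.add((EX.supplies, RDFS.label, Literal("supplies", lang="en")))

print(g.serialize(format="turtle"))
```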
Two Heads Are Better Than One: Integrating Knowledge from Knowledge Graphs and Large Language Models for Entity Alignment
Entity alignment, which is a prerequisite for creating a more comprehensive Knowledge Graph (KG), involves pinpointing equivalent entities across disparate KGs. Contemporary methods for entity alignment have predominantly utilized knowledge embedding models to procure entity embeddings that encapsulate various similarities: structural, relational, and attributive. These embeddings are then integrated through attention-based information fusion mechanisms. Despite this progress, effectively harnessing multifaceted information remains challenging due to inherent heterogeneity. Moreover, while Large Language Models (LLMs) have exhibited exceptional performance across diverse downstream tasks by implicitly capturing entity semantics, this implicit knowledge has yet to be exploited for entity alignment. In this study, we propose a Large Language Model-enhanced Entity Alignment framework (LLMEA), integrating structural knowledge from KGs with semantic knowledge from LLMs to enhance entity alignment. Specifically, LLMEA identifies candidate alignments for a given entity by considering both embedding similarities between entities across KGs and edit distances to a virtual equivalent entity. It then engages an LLM iteratively, posing multiple multi-choice questions to draw upon the LLM's inference capability. The final prediction of the equivalent entity is derived from the LLM's output. Experiments conducted on three public datasets reveal that LLMEA surpasses leading baseline models. Additional ablation studies underscore the efficacy of our proposed framework.
·arxiv.org·
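The pipeline the abstract outlines (candidate selection from embedding similarity plus edit distance, followed by multi-choice questions to an LLM) might look roughly like the sketch below. This is an illustrative reading of the paper, not the authors' code; embed and call_llm are hypothetical stand-ins, and difflib's similarity ratio stands in for the edit-distance component.

```python
# Candidate selection + LLM multiple-choice step for entity alignment.
import difflib
import numpy as np

def embed(name: str) -> np.ndarray:
    """Hypothetical entity-embedding lookup (e.g., from a KG embedding model)."""
    raise NotImplementedError

def call_llm(prompt: str) -> str:
    """Hypothetical LLM call returning the chosen option letter."""
    raise NotImplementedError

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def candidate_alignments(entity: str, other_kg: list[str], k: int = 4) -> list[str]:
    """Rank candidates by embedding similarity plus string similarity."""
    scores = {
        cand: cosine(embed(entity), embed(cand))
        + difflib.SequenceMatcher(None, entity, cand).ratio()
        for cand in other_kg
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

def align(entity: str, other_kg: list[str]) -> str:
    """Pose a multiple-choice question to the LLM over the top candidates."""
    options = candidate_alignments(entity, other_kg)
    lines = [f"{chr(65 + i)}. {cand}" for i, cand in enumerate(options)]
    prompt = (
        f"Which of the following refers to the same real-world entity as "
        f"'{entity}'? Answer with a single letter.\n" + "\n".join(lines)
    )
    answer = call_llm(prompt).strip()[:1].upper()
    return options[ord(answer) - 65]
```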
Knowledge Graphs Achieve Superior Reasoning versus Vector Search alone for Retrieval Augmentation
As artificial intelligence permeates business…
·linkedin.com·
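A minimal sketch of the contrast being claimed: a plain vector search returns the single closest item, while a graph-backed retriever expands from that entry point along typed edges to gather multi-hop context before prompting the model. The vector_search stand-in and the relation edge attribute are assumptions for illustration.

```python
# Graph-augmented retrieval: vector search finds an entry node, graph
# traversal supplies the multi-hop context that flat lookup would miss.
import networkx as nx

def vector_search(query: str, nodes: list[str]) -> str:
    """Hypothetical stand-in for a vector-similarity lookup over node texts."""
    raise NotImplementedError

def graph_retrieve(graph: nx.DiGraph, query: str, hops: int = 2) -> list[str]:
    """Expand the top vector hit along graph edges to gather connected facts."""
    seed = vector_search(query, list(graph.nodes))
    frontier, context = {seed}, []
    for _ in range(hops):
        next_frontier = set()
        for node in frontier:
            for _, neighbor, data in graph.out_edges(node, data=True):
                relation = data.get("relation", "related_to")
                context.append(f"{node} --{relation}--> {neighbor}")
                next_frontier.add(neighbor)
        frontier = next_frontier
    return context  # feed these facts into the generation prompt
```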
Understand and Exploit GenAI With Gartner’s New Impact Radar
Use Gartner's impact radar for generative AI to plan investments and strategy with four key themes in mind: ☑️ Model-related innovations ☑️ Model performance and AI safety ☑️ Model build and data-related ☑️ AI-enabled applications. Explore all 25 technologies and trends: https://www.gartner.com/en/articles/understand-and-exploit-gen-ai-with-gartner-s-new-impact-radar
·gartner.com·
The Role of the Ontologist in the Age of LLMs
What do we mean when we say something is a kind of thing? I’ve been wrestling with that question a great deal of late, partly because I think the role of the ontologist transcends the application of knowledge graphs, especially as I’ve watched LLMs and Llamas become a bigger part of the discussion.
·ontologist.substack.com·
Knowledge Engineering Using Large Language Models
Knowledge engineering is a discipline that focuses on the creation and maintenance of processes that generate and apply knowledge. Traditionally, knowledge engineering approaches have focused on knowledge expressed in formal languages. The emergence of large language models and their capabilities to effectively work with natural language, in its broadest sense, raises questions about the foundations and practice of knowledge engineering. Here, we outline the potential role of LLMs in knowledge engineering, identifying two central directions: 1) creating hybrid neuro-symbolic knowledge systems; and 2) enabling knowledge engineering in natural language. Additionally, we formulate key open research questions to tackle these directions.
·drops.dagstuhl.de·
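One way to picture the "hybrid neuro-symbolic" direction the paper identifies, as a sketch under stated assumptions: an LLM drafts candidate triples from a natural-language requirement, and a small symbolic check keeps only those that fit a declared schema. call_llm, the schema, and the entity typing are all hypothetical.

```python
# Hybrid neuro-symbolic knowledge engineering: neural drafting, symbolic validation.
import json

def call_llm(prompt: str) -> str:
    """Hypothetical LLM call; replace with your model client."""
    raise NotImplementedError

# Symbolic side: relations the knowledge model allows, with domain/range types.
SCHEMA = {
    "supplies": ("Supplier", "Component"),
    "partOf": ("Component", "Product"),
}
ENTITY_TYPES = {"AcmeCorp": "Supplier", "Gearbox": "Component", "Tractor": "Product"}

def draft_triples(requirement: str) -> list[tuple[str, str, str]]:
    """Neural side: ask the LLM to phrase the requirement as candidate triples."""
    prompt = (
        "Rewrite the requirement below as (subject, relation, object) triples, "
        "returned as a JSON list of 3-element lists.\n\n" + requirement
    )
    return [tuple(t) for t in json.loads(call_llm(prompt))]

def validate(triples: list[tuple[str, str, str]]) -> list[tuple[str, str, str]]:
    """Symbolic side: keep only triples whose relation and types fit the schema."""
    kept = []
    for subj, rel, obj in triples:
        domain_range = SCHEMA.get(rel)
        if domain_range and (ENTITY_TYPES.get(subj), ENTITY_TYPES.get(obj)) == domain_range:
            kept.append((subj, rel, obj))
    return kept
```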