Found 62 bookmarks
OpenGraph: Open-Vocabulary Hierarchical 3D Graph Representation in Large-Scale Outdoor Environments
Environment maps endowed with sophisticated semantics are pivotal for enabling seamless interaction between robots and humans, allowing them to effectively carry out various tasks. Open-vocabulary maps, powered by Vision-Language Models (VLMs), have inherent advantages, including multimodal retrieval and open-set classes. However, existing open-vocabulary maps are constrained to closed indoor scenarios and to raw VLM features, which limits their usability and inference capability. Moreover, the absence of topological relationships further complicates accurate querying of specific instances. In this work, we propose OpenGraph, an open-vocabulary hierarchical graph representation designed for large-scale outdoor environments. OpenGraph first extracts instances and their captions from visual images using 2D foundation models, encoding the captions as features to enhance textual reasoning. Next, 3D incremental panoramic mapping with feature embedding is achieved by projecting images onto LiDAR point clouds. Finally, the environment is segmented based on lane-graph connectivity to construct a hierarchical graph. Validation on the public SemanticKITTI dataset demonstrates that, even without fine-tuning the models, OpenGraph generalizes to novel semantic classes and achieves the highest segmentation and query accuracy. The source code of OpenGraph is publicly available at https://github.com/BIT-DYN/OpenGraph.
·arxiv.org·
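A rough sketch of the kind of structure the abstract describes – hierarchical segments holding captioned, feature-embedded instances – with hypothetical names, not code from the OpenGraph repository:

```python
from dataclasses import dataclass, field

import numpy as np

@dataclass
class Instance:
    """A mapped 3D object with an open-vocabulary caption and a text feature."""
    caption: str          # e.g. "a parked red car"
    feature: np.ndarray   # embedding of the caption from a text encoder
    points: np.ndarray    # (N, 3) LiDAR points assigned to this instance

@dataclass
class Segment:
    """One region of the map, e.g. a stretch of road in the lane graph."""
    instances: list[Instance] = field(default_factory=list)

@dataclass
class HierarchicalMap:
    """Top level: segments as nodes, lane-graph connectivity as edges."""
    segments: list[Segment] = field(default_factory=list)
    edges: set[tuple[int, int]] = field(default_factory=set)

    def query(self, text_feature: np.ndarray) -> Instance:
        """Return the instance whose caption feature best matches a text query."""
        return max(
            (inst for seg in self.segments for inst in seg.instances),
            key=lambda inst: float(inst.feature @ text_feature),
        )
```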
Personalized Audiobook Recommendations at Spotify Through Graph Neural Networks
In the ever-evolving digital audio landscape, Spotify, well known for its music and talk content, has recently introduced audiobooks to its vast user base. While promising, this move presents significant challenges for personalized recommendations. Unlike music and podcasts, audiobooks, initially available for a fee, cannot be easily skimmed before purchase, raising the stakes for the relevance of recommendations. Furthermore, introducing a new content type into an existing platform confronts extreme data sparsity, as most users are unfamiliar with it. Lastly, recommending content to millions of users requires the model to respond quickly and scale. To address these challenges, we leverage podcast and music user preferences and introduce 2T-HGNN, a scalable recommendation system comprising Heterogeneous Graph Neural Networks (HGNNs) and a Two Tower (2T) model. This novel approach uncovers nuanced item relationships while ensuring low latency and complexity. We decouple users from the HGNN graph and propose an innovative multi-link neighbor sampler. These choices, together with the 2T component, significantly reduce the complexity of the HGNN model. Empirical evaluations involving millions of users show significant improvements in the quality of personalized recommendations, resulting in a +46% increase in the new-audiobook start rate and a +23% boost in streaming rates. Intriguingly, our model's impact extends beyond audiobooks, benefiting established products like podcasts.
·arxiv.org·
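The Two Tower half of the design is a standard pattern; below is a generic PyTorch sketch of it (illustrative dimensions, not Spotify's actual 2T-HGNN), where the item tower could consume embeddings produced offline by the HGNN:

```python
import torch
import torch.nn as nn

class Tower(nn.Module):
    """One side of a two-tower model: maps raw features to a unit embedding."""
    def __init__(self, in_dim: int, emb_dim: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, emb_dim)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return nn.functional.normalize(self.net(x), dim=-1)

user_tower = Tower(in_dim=32)   # user features (e.g. music/podcast history)
item_tower = Tower(in_dim=48)   # item features (e.g. an HGNN item embedding)

users = torch.randn(8, 32)
items = torch.randn(8, 48)
scores = (user_tower(users) * item_tower(items)).sum(dim=-1)  # affinity per pair
```

Because each tower can be evaluated independently, item embeddings can be precomputed and served from an approximate nearest-neighbor index, which is what keeps latency low at this scale.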
Why do LangChain and Autogen use graphs? Here are the top reasons
LLM frameworks like LangChain are moving towards a graph-based approach for handling their workflows. This represents the initial steps of a much larger…
·linkedin.com·
Language, Graphs, and AI in Industry
Over the past 5 years, news about AI has been filled with amazing research – at first focused on graph neural networks (GNNs) and more recently on large language models (LLMs). Business tends to run on connected data – networks, graphs – whether you’re untangling supply networks in Manufacturing, working on drug discovery for Pharma, or mitigating fraud in Finance. Starting from supplier agreements, bills of materials, internal process docs, sales contracts, etc., there’s a graph inside nearly every business process, one that is defined by language. This talk addresses how to leverage natural language and graph technologies together for AI applications in industry. We’ll look at how LLMs get used to build and augment graphs, and conversely how graph data gets used to ground LLMs for generative AI use cases in industry – where a kind of “virtuous cycle” is emerging for feedback loops based on graph data. Our team has been engaged, on the one hand, with enterprise use cases in manufacturing; on the other hand, we’ve worked as intermediaries between research teams funded by enterprise and the open source projects enterprise needs – particularly in the open source ecosystem for AI models. There are caveats: this work is not simple, and translating from the latest research into production-ready code is especially complex and expensive. Let’s examine the caveats other teams should understand, and look toward practical examples.
·derwen.ai·
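The “virtuous cycle” the talk describes can be sketched in a few lines – an LLM builds the graph, and the graph grounds the LLM. A hypothetical sketch with a stubbed llm() call (all names here are illustrative):

```python
import networkx as nx

def llm(prompt: str) -> str:
    """Stand-in for any chat/completion API call (hypothetical stub)."""
    raise NotImplementedError

def text_to_graph(doc: str, graph: nx.DiGraph) -> None:
    """LLM -> graph: extract (subject, relation, object) triples from text."""
    reply = llm(f"Extract triples as 'subject|relation|object' lines:\n{doc}")
    for line in reply.splitlines():
        s, r, o = (part.strip() for part in line.split("|"))
        graph.add_edge(s, o, relation=r)

def grounded_answer(question: str, graph: nx.DiGraph, entity: str) -> str:
    """Graph -> LLM: ground the model with facts pulled from the graph."""
    facts = [
        f"{u} {d['relation']} {v}" for u, v, d in graph.edges(entity, data=True)
    ]
    return llm("Facts:\n" + "\n".join(facts) + f"\n\nQuestion: {question}")
```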
Let Your Graph Do the Talking: Encoding Structured Data for LLMs
Let your data speak! Inject structured data directly with GraphTokens and supercharge your LLM's reasoning abilities. Our exciting research is…
·linkedin.com·
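The idea, as the post frames it, is to inject structured data directly rather than serializing the graph to text. A generic sketch of graph-to-soft-token encoding (illustrative names and sizes, not the paper's actual GraphToken implementation):

```python
import torch
import torch.nn as nn

class GraphTokenizer(nn.Module):
    """Encode a graph into a few soft 'graph tokens' to prepend to LLM input."""
    def __init__(self, node_dim: int, llm_dim: int, n_tokens: int = 4):
        super().__init__()
        self.encode = nn.Linear(node_dim, llm_dim)      # stand-in for a GNN
        self.project = nn.Linear(llm_dim, n_tokens * llm_dim)
        self.n_tokens, self.llm_dim = n_tokens, llm_dim

    def forward(self, node_feats: torch.Tensor) -> torch.Tensor:
        pooled = self.encode(node_feats).mean(dim=0)    # graph-level summary
        return self.project(pooled).view(self.n_tokens, self.llm_dim)

graph_tokens = GraphTokenizer(node_dim=16, llm_dim=256)(torch.randn(10, 16))
# then: torch.cat([graph_tokens, text_token_embeddings]) as the LLM's input
```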
The Intersection of Graphs and Language Models
Large language models (LLMs) have rapidly advanced, displaying impressive abilities in comprehending…
·linkedin.com·
LangGraph: Multi-Agent Workflows
Last week we highlighted LangGraph – a new package (available in both Python and JS) that better enables creation of LLM workflows containing cycles, which are a critical component of most agent runtimes. As part of that launch, we highlighted two simple runtimes. This post covers a second set of use cases for LangGraph: multi-agent workflows. In this blog we will cover: what "multi-agent" means; why multi-agent workflows are interesting; three concrete examples of using LangGraph for multi-agent workflows; two examples of third-party applications built on top of LangGraph using multi-agent workflows (GPT-Newspaper and CrewAI); and a comparison to other frameworks (Autogen and CrewAI).
·blog.langchain.dev·
🦜🕸️LangGraph | 🦜️🔗 Langchain
⚡ Building language agents as graphs ⚡
LangGraph is a library for building stateful, multi-actor applications with LLMs, built on top of (and intended to be used with) LangChain. It extends the LangChain Expression Language with the ability to coordinate multiple chains (or actors) across multiple steps of computation in a cyclic manner. It is inspired by Pregel and Apache Beam, and the current interface it exposes is inspired by NetworkX. The main use is for adding cycles to your LLM application. Crucially, this is NOT a DAG framework – if you want to build a DAG, you should just use LangChain Expression Language. Cycles are important for agent-like behaviors, where you call an LLM in a loop, asking it what action to take next.
·python.langchain.com·
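A minimal sketch of such a cycle using the LangGraph API (the node logic is stubbed out, and the stopping rule here is a placeholder for whatever the LLM decides):

```python
from typing import TypedDict
from langgraph.graph import StateGraph, END

class State(TypedDict):
    question: str
    answer: str
    attempts: int

def agent(state: State) -> State:
    # placeholder for an LLM call that drafts or refines an answer
    return {**state, "answer": f"draft #{state['attempts'] + 1}",
            "attempts": state["attempts"] + 1}

def should_continue(state: State) -> str:
    # loop back to the agent until some stopping condition holds
    return "continue" if state["attempts"] < 3 else "end"

builder = StateGraph(State)
builder.add_node("agent", agent)
builder.set_entry_point("agent")
builder.add_conditional_edges("agent", should_continue,
                              {"continue": "agent", "end": END})
app = builder.compile()
print(app.invoke({"question": "hi", "answer": "", "attempts": 0}))
```

The conditional edge from "agent" back to itself is exactly the cycle a DAG framework cannot express.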
Graph & Geometric ML in 2024: Where We Are and What’s Next (Part I — Theory & Architectures)
Trends and recent advancements in Graph and Geometric Deep Learning
Following the tradition from previous years, we interviewed a cohort of distinguished and prolific academic and industrial experts in an attempt to summarise the highlights of the past year and predict what is in store for 2024. The past year was so ripe with results that we had to break this post into two parts. This is Part I, focusing on theory & new architectures.
·towardsdatascience.com·
pacoid (Paco Xander Nathan)
Python open source projects; natural language meets graph technologies; graph topological transformations; graph levels of detail (abstraction layers)
·huggingface.co·
Neural algorithmic reasoning without intermediate supervision
Neural algorithmic reasoning focuses on building models that can execute classic algorithms. It allows one to combine the advantages of neural networks, such as handling raw and noisy input data, with theoretical guarantees and strong generalization of algorithms. Assuming we have a neural network capable of solving a classic algorithmic task, we can incorporate it into a more complex pipeline and train end-to-end. For instance, if we have a neural solver aligned to the shortest path problem, it can be used as a building block for a routing system that accounts for complex and dynamically changing traffic conditions. In our work [ref1], we study algorithmic reasoners trained only from input-output pairs, in contrast to current state-of-the-art approaches that utilize the trajectory of a given algorithm. We propose several architectural modifications and demonstrate how standard contrastive learning techniques can regularize intermediate computations of the models without appealing to any predefined algorithm’s trajectory.
·research.yandex.com·
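The paper's exact regularizer isn't spelled out in this abstract; as a generic sketch, a standard InfoNCE-style contrastive loss applied to intermediate model states might look like the following (this is the general technique, not the authors' method):

```python
import torch
import torch.nn.functional as F

def contrastive_reg(h_a: torch.Tensor, h_b: torch.Tensor, tau: float = 0.1):
    """InfoNCE-style loss: intermediate states of the same instance under two
    views (h_a[i], h_b[i]) should agree, while other pairs are pushed apart."""
    h_a, h_b = F.normalize(h_a, dim=-1), F.normalize(h_b, dim=-1)
    logits = h_a @ h_b.T / tau           # (B, B) pairwise similarity matrix
    targets = torch.arange(h_a.size(0))  # positives lie on the diagonal
    return F.cross_entropy(logits, targets)
```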
Data gauging, covariance and equivariance | Maurice Weiler
The numerical representation of data is often ambiguous. This leads to a gauge-theoretic view of data, which requires covariant or equivariant neural networks; these are reviewed in this blog post.
·maurice-weiler.gitlab.io·
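For reference, the standard equivariance condition the post builds on: a network $f$ is equivariant when transforming the input by a group element $g$ transforms the output correspondingly, where $\rho_{\mathrm{in}}$ and $\rho_{\mathrm{out}}$ are representations of the group $G$ on the input and output spaces (invariance is the special case where $\rho_{\mathrm{out}}$ is trivial):

```latex
f\bigl(\rho_{\mathrm{in}}(g)\,x\bigr) \;=\; \rho_{\mathrm{out}}(g)\,f(x)
\qquad \text{for all } g \in G,\; x \in X .
```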
Graph Learning Meets Artificial Intelligence
By request, here are the slides from our #neurips2023 presentation yesterday! We really enjoyed the opportunity to present the different aspects of the work…
·linkedin.com·
A Survey of Graph Meets Large Language Model: Progress and Future Directions
Graphs play a significant role in representing and analyzing complex relationships in real-world applications such as citation networks, social networks, and biological data. Recently, Large Language Models (LLMs), which have achieved tremendous success in various domains, have also been leveraged in graph-related tasks to surpass traditional methods based on Graph Neural Networks (GNNs) and yield state-of-the-art performance. In this survey, we present a comprehensive review and analysis of existing methods that integrate LLMs with graphs. First, we propose a new taxonomy, which organizes existing methods into three categories based on the role played by the LLM in graph-related tasks (i.e., enhancer, predictor, and alignment component). Then we systematically survey the representative methods along the three categories of the taxonomy. Finally, we discuss the remaining limitations of existing studies and highlight promising avenues for future research. The relevant papers are summarized and will be continually updated at: https://github.com/yhLeeee/Awesome-LLMs-in-Graph-tasks.
·arxiv.org·
TacticAI: an AI assistant for football tactics using Graph AI
"TacticAI: an AI assistant for football tactics" by Zhe W., Petar Veličković, Daniel Hennes, Nenad Tomašev, Laurel Prince, Yoram Bachrach, Romuald Elie, Kevin… | 28 comments on LinkedIn
TacticAI: an AI assistant for football tactics
·linkedin.com·
Graph of Thoughts: Solving Elaborate Problems with Large Language Models
We introduce Graph of Thoughts (GoT): a framework that advances prompting capabilities in large language models (LLMs) beyond those offered by paradigms such as Chain-of-Thought or Tree of Thoughts (ToT). The key idea and primary advantage of GoT is the ability to model the information generated by an LLM as an arbitrary graph, where units of information ("LLM thoughts") are vertices, and edges correspond to dependencies between these vertices. This approach enables combining arbitrary LLM thoughts into synergistic outcomes, distilling the essence of whole networks of thoughts, or enhancing thoughts using feedback loops. We illustrate that GoT offers advantages over the state of the art on different tasks, for example increasing the quality of sorting by 62% over ToT while simultaneously reducing costs by 31%. We ensure that GoT is extensible with new thought transformations and thus can be used to spearhead new prompting schemes. This work brings LLM reasoning closer to human thinking or brain mechanisms such as recurrence, both of which form complex networks.
·arxiv.org·
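The core data structure is easy to picture: thoughts as vertices, dependencies as edges, with aggregation merging several thoughts into one. A minimal illustrative sketch (hypothetical names, not the authors' implementation):

```python
from dataclasses import dataclass, field

@dataclass
class Thought:
    text: str
    score: float = 0.0   # e.g. from an LLM-based or programmatic scorer

@dataclass
class ThoughtGraph:
    """Vertices are LLM thoughts; an edge (child, parent) records that the
    child thought was derived from the parent thought."""
    thoughts: list[Thought] = field(default_factory=list)
    edges: list[tuple[int, int]] = field(default_factory=list)

    def add(self, text: str, parents: tuple[int, ...] = ()) -> int:
        self.thoughts.append(Thought(text))
        idx = len(self.thoughts) - 1
        self.edges.extend((idx, p) for p in parents)
        return idx

g = ThoughtGraph()
a = g.add("sort the first half of the list")
b = g.add("sort the second half of the list")
g.add("merge both sorted halves", parents=(a, b))  # aggregation: many -> one
```

Chain-of-Thought is the special case where every vertex has one parent; Tree of Thoughts allows branching but no merging. The many-parents edge above is what makes the structure a general graph.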
Vectors need Graphs!
Embedding vectors are a pivotal tool when using Generative AI. While vectors might initially seem an unlikely partner to graphs, their…
·linkedin.com·