A Simple Guide To Retrieval Augmented Generation Language Models — Smashing Magazine
Language models have shown impressive capabilities. But that doesn’t mean they’re without faults, as anyone who has witnessed a ChatGPT “hallucination” can attest. In this article, Joas Pambou diagnoses the symptoms that cause hallucinations and explains not only what RAG is but also different approaches for using it to solve language model limitations.
·smashingmagazine.com·
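The retrieve-then-generate loop the article describes fits in a few lines. A minimal sketch, assuming a toy in-memory corpus; `embed` here is a random stand-in for a real embedding model and `generate` stands in for whichever LLM call you use:

```python
import numpy as np

def embed(text: str) -> np.ndarray:
    # Stand-in for a real embedding model (a sentence-transformer, an embeddings API, ...).
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(384)

def retrieve(query: str, corpus: list[str], k: int = 3) -> list[str]:
    # Rank passages by cosine similarity to the query and keep the top k.
    q = embed(query)
    scored = []
    for passage in corpus:
        p = embed(passage)
        sim = float(q @ p / (np.linalg.norm(q) * np.linalg.norm(p)))
        scored.append((sim, passage))
    scored.sort(key=lambda s: s[0], reverse=True)
    return [passage for _, passage in scored[:k]]

def rag_answer(query: str, corpus: list[str], generate) -> str:
    # Augment the prompt with retrieved context, then let the model answer.
    context = "\n\n".join(retrieve(query, corpus))
    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )
    return generate(prompt)
```

The point is only the shape of the pipeline: grounding the prompt in retrieved documents is what reduces the hallucinations the article diagnoses.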
LangGraph Crash Course with code examples
Colab 01. Learning LangGraph Agent Executor: https://drp.li/vL1J9
Colab 02. Learning LangGraph - Chat Executor: https://drp.li/HAz3o
Colab 03. Learning LangGraph - Agent Supervisor: https://drp.li/xvEwd
Interested in building LLM Agents? Fill out the form below. Building LLM Agents Form: https://drp.li/dIMes
Github: https://github.com/samwit/langchain-tutorials (updated) https://github.com/samwit/llm-tutorials
Time Stamps:
00:00 Intro
00:19 What is LangGraph?
00:26 LangGraph Blog
01:38 StateGraph
02:16 Nodes
02:42 Edges
03:48 Compiling the Graph
05:23 Code Time
05:34 Agent with new create_open_ai
21:37 Chat Executor
27:00 Agent Supervisor
·youtube.com·
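For orientation before watching: a LangGraph program is a StateGraph whose nodes are plain functions over a shared state, wired together with edges and then compiled. A minimal sketch (the `AgentState` schema and `respond` node are invented here for illustration; `StateGraph`, `END`, `add_node`, `add_edge`, `set_entry_point`, and `compile` are the LangGraph names the video covers):

```python
from typing import TypedDict
from langgraph.graph import StateGraph, END

class AgentState(TypedDict):
    # Hypothetical state schema: a single string field passed between nodes.
    message: str

def respond(state: AgentState) -> AgentState:
    # A node is a plain function: read the state, return the updated state.
    return {"message": state["message"] + " -> handled"}

graph = StateGraph(AgentState)
graph.add_node("respond", respond)  # register the node under a name
graph.set_entry_point("respond")    # where execution starts
graph.add_edge("respond", END)      # finish after the node runs
app = graph.compile()               # compile the graph into a runnable

print(app.invoke({"message": "hello"}))  # {'message': 'hello -> handled'}
```

The agent executor, chat executor, and supervisor in the Colabs follow the same pattern with more nodes and conditional edges.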
The Narrated Transformer Language Model
AI/ML has been witnessing a rapid acceleration in model improvement in the last few years. The majority of the state-of-the-art models in the field are based on the Transformer architecture. Examples include models like BERT (which when applied to Google Search, resulted in what Google calls "one of the biggest leaps forward in the history of Search") and OpenAI's GPT2 and GPT3 (which are able to generate coherent text and essays). This video by the author of the popular "Illustrated Transformer" guide will introduce the Transformer architecture and its various applications. This is a visual presentation accessible to people with various levels of ML experience.
Intro (0:00)
The Architecture of the Transformer (4:18)
Model Training (7:11)
Transformer LM Component 1: FFNN (10:01)
Transformer LM Component 2: Self-Attention (12:27)
Tokenization: Words to Token Ids (14:59)
Embedding: Breathe meaning into tokens (19:42)
Projecting the Output: Turning Computation into Language (24:11)
Final Note: Visualizing Probabilities (25:51)
The Illustrated Transformer: https://jalammar.github.io/illustrated-transformer/
Simple transformer language model notebook: https://github.com/jalammar/jalammar.github.io/blob/master/notebooks/Simple_Transformer_Language_Model.ipynb
Philosophers On GPT-3 (updated with replies by GPT-3): https://dailynous.com/2020/07/30/philosophers-gpt-3/
Twitter: https://twitter.com/JayAlammar
Blog: https://jalammar.github.io/
Mailing List: https://jayalammar.substack.com/
More videos by Jay:
Jay's Visual Intro to AI https://www.youtube.com/watch?v=mSTCzNgDJy4
How GPT-3 Works - Easily Explained with Animations https://www.youtube.com/watch?v=MQnJZuBGmSQ
·youtube.com·
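The self-attention component the video covers at 12:27 is scaled dot-product attention: every token scores every other token, the scores are softmaxed into weights, and the output is a weighted sum of value vectors. A single-head NumPy sketch, with toy dimensions chosen here for illustration:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (seq_len, d_model); Wq, Wk, Wv: (d_model, d_head)
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # (seq_len, seq_len) token-to-token scores
    weights = softmax(scores, axis=-1)       # each row sums to 1
    return weights @ V                       # weighted sum of values, (seq_len, d_head)

# Toy example: 4 tokens, model dim 8, head dim 4.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))
Wq, Wk, Wv = (rng.standard_normal((8, 4)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (4, 4)
```

A Transformer language model additionally applies a causal mask so each position attends only to earlier positions, which this sketch omits.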
Deep Learning - Foundations and Concepts
This book offers a comprehensive introduction to the central ideas that underpin deep learning. It is intended both for newcomers to machine learning and for those already experienced in the field.
·bishopbook.com·
MIT MAS.S68!
Generative AI for Constructive Communication Course
·ai4comm.media.mit.edu·
ChatGPT Prompt Engineering for Developers
What you’ll learn in this course: In ChatGPT Prompt Engineering for Developers, you will learn how to use a large language model (LLM) to quickly build new and powerful applications. Using the OpenAI API, you’ll...
·deeplearning.ai·
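The course issues its prompts through a small helper around the OpenAI API. A sketch in that spirit (the v1-style `OpenAI` client, the model name, and the example prompt are assumptions here; the course notebooks define their own `get_completion`):

```python
from openai import OpenAI  # assumes the openai Python package, v1+ client style

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def get_completion(prompt: str, model: str = "gpt-3.5-turbo") -> str:
    # Send a single-turn prompt and return the model's text reply.
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
        temperature=0,  # low temperature for more repeatable output
    )
    return response.choices[0].message.content

text = "Summarize the following in one sentence: Large language models predict the next token."
print(get_completion(text))
```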