LLMs

GraphRAG: A new approach for discovery using complex information
Microsoft is transforming retrieval-augmented generation with GraphRAG, using LLM-generated knowledge graphs to significantly improve Q&A when analyzing complex information and consistently outperforming baseline RAG. Get the details.
·microsoft.com·
Creating a GPT Assistant That Writes Pipeline Tests
For rapid prototyping of LLM-backed tools, GPT Creator is a dream come true, says Jon Udell. He shows how he made — and used — a custom GPT.
·thenewstack.io·
What is Generative AI and LLMs, really?
First part of a series of blog posts that dive deep into what AI and LLMs really are, and why you should care.
·thoughtbot.com·
TIL: Sum Types With instructor_ex
The Instructor Elixir library lets you retrieve structured output from LLMs like the OpenAI GPT models.
·samrat.me·
Learn by Doing: How LLMs Should Reshape Education
The path toward hands-on autonomous learning is through large language models (LLMs). Jon Udell shows how the EdTech sector can use AI.
·thenewstack.io·
Getting started with Ollama with Microsoft's Phi-2
It seems that each week brings a dozen new generative AI-based tools and services. Many are wrappers around ChatGPT (or the underlying LLMs such as GPT-3.5 Turbo), while some bring much more. Ollama is one of the latter, and it's amazing. In this blog post, I'll briefly examine…
·jussiroine.com·
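
As a rough sketch of what "getting started" looks like (assuming Ollama is already installed and that Phi-2 is published in the Ollama library under the `phi` tag, which is an assumption):

```bash
# Download the Phi-2 weights from the Ollama library (model tag assumed to be "phi")
ollama pull phi

# Open an interactive chat session with the model
ollama run phi

# Or send a single prompt non-interactively
ollama run phi "Summarize retrieval-augmented generation in two sentences."
```
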
The Tradeoffs in AI
AI will improve in accuracy, creativity, and speed. Even LLMs. But there seems to be an inherent tradeoff between these attributes, so that you can't optimize more than two of them at once. To be clear, we will have AIs…
·kk.org·
The Million Dollar Matrix Multiply
More efficient matrix multiplication algorithms could save millions of dollars on computationally intense tasks such as training LLMs.
·johndcook.com·
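
To make "more efficient algorithms" concrete: the schoolbook method needs 8 block multiplications for a 2×2 block product, while Strassen's method needs only 7, and applying that recursively lowers the exponent from 3 to log₂ 7 ≈ 2.807. A minimal Python sketch of a single Strassen level (NumPy is used only for the block arithmetic):

```python
import numpy as np

def strassen_one_level(A, B):
    """Multiply two (2n x 2n) matrices with Strassen's 7 block products
    instead of the classical 8."""
    n = A.shape[0] // 2
    A11, A12, A21, A22 = A[:n, :n], A[:n, n:], A[n:, :n], A[n:, n:]
    B11, B12, B21, B22 = B[:n, :n], B[:n, n:], B[n:, :n], B[n:, n:]

    # Strassen's seven products
    M1 = (A11 + A22) @ (B11 + B22)
    M2 = (A21 + A22) @ B11
    M3 = A11 @ (B12 - B22)
    M4 = A22 @ (B21 - B11)
    M5 = (A11 + A12) @ B22
    M6 = (A21 - A11) @ (B11 + B12)
    M7 = (A12 - A22) @ (B21 + B22)

    # Reassemble the four blocks of the product
    C11 = M1 + M4 - M5 + M7
    C12 = M3 + M5
    C21 = M2 + M4
    C22 = M1 - M2 + M3 + M6
    return np.block([[C11, C12], [C21, C22]])

# Sanity check against NumPy's own matrix multiply
A, B = np.random.rand(4, 4), np.random.rand(4, 4)
assert np.allclose(strassen_one_level(A, B), A @ B)
```
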
The future of LLMs is local
I saw this post over on json.blog, OpenAI Is Just Uber, which sounds like it makes sense, but I think it draws the wrong conclusions on a couple of fronts. A decade later, ride-sharing hasn't evolved significantly since its launch. Costs have risen as consumers now pay the actual…
·birchtree.me·
Retrieval Augmented Generation for LLMs
Retrieval-augmented generation (RAG) is a cutting-edge approach in NLP and AI. Badrul Sarwar, a machine learning scientist, shares his tips.
·thenewstack.io·
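
The core loop of RAG is simple to sketch: retrieve the passages most relevant to the question, then prepend them to the prompt so the model answers from that context rather than from memory alone. A toy Python illustration (the corpus, the bag-of-words scoring, and the `call_llm` stub are illustrative placeholders, not anything from the article):

```python
from collections import Counter

DOCUMENTS = [  # toy corpus standing in for a real vector store
    "GraphRAG builds an LLM-generated knowledge graph over the corpus.",
    "Ollama runs open-weight models such as Phi-2 locally.",
    "robots.txt directives can ask GPTBot and Google-Extended not to crawl a site.",
]

def score(query: str, doc: str) -> int:
    """Crude relevance score: number of shared lowercase tokens."""
    q, d = Counter(query.lower().split()), Counter(doc.lower().split())
    return sum((q & d).values())

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k highest-scoring documents for the query."""
    return sorted(DOCUMENTS, key=lambda doc: score(query, doc), reverse=True)[:k]

def build_prompt(query: str) -> str:
    """Augment the prompt with retrieved context before calling the model."""
    context = "\n".join(f"- {doc}" for doc in retrieve(query))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

print(build_prompt("How do I run Phi-2 locally?"))
# In a real pipeline the string above would be sent to an LLM API,
# e.g. response = call_llm(build_prompt(query))  # call_llm is hypothetical
```
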
No Robots(.txt): How to Ask ChatGPT and Google Bard to Not Use Your Website for Training
Both OpenAI and Google have released guidance for website owners who do not want the two companies using the content of their sites to train the company's large language models (LLMs). We've long been supporters of the right to scrape websites—the process of using a computer to load and read pages...
·eff.org·
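
The guidance boils down to two user-agent rules in robots.txt: OpenAI's crawler identifies itself as GPTBot, and Google uses the Google-Extended token to control whether a site is used for AI training. A minimal opt-out for an entire site looks like this:

```text
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```
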
Top 5 AI Engineering Trends of 2023
Alongside the proliferation of LLMs, there has been an expansion of dev tools for AI. We look at five key trends in AI development this year.
·thenewstack.io·