Discussions: Hacker News (347 points, 37 comments), Reddit r/MachineLearning (151 points, 19 comments)
Translations: Chinese (Simplified), French, Korean, Portuguese, Russian
“There is in all things a pattern that is part of our universe. It has symmetry, elegance, and grace - those qualities you find always in that which the true artist captures. You can find it in the turning of the seasons, in the way sand trails along a ridge, in the branch clusters of the creosote bush or the pattern of its leaves.

We try to copy these patterns in our lives and our society, seeking the rhythms, the dances, the forms that comfort. Yet, it is possible to see peril in the finding of ultimate perfection. It is clear that the ultimate pattern contains its own fixity. In such perfection, all things move toward death.”

~ Dune (1965)
I find the concept of embeddings to be one of the most fascinating ideas in machine learning. If you’ve ever used Siri, Google Assistant, Alexa, Google Translate, or even a smartphone keyboard with next-word prediction, then chances are you’ve benefitted from this idea that has become central to Natural Language Processing models. There has been a great deal of development over the last couple of decades in using embeddings for neural models (recent developments include contextualized word embeddings, leading to cutting-edge models like BERT and GPT-2).
Word2vec is a method to efficiently create word embeddings and has been around since 2013. But in addition to its utility as a word-embedding method, some of its concepts have been shown to be effective in creating recommendation engines and making sense of sequential data even in commercial, non-language tasks. Companies like Airbnb, Alibaba, Spotify, and Anghami have all benefitted from carving out this brilliant piece of machinery from the world of NLP and using it in production to empower a new breed of recommendation engines.
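Before we get into the mechanics, a quick sketch may help ground the idea of "creating word embeddings" in code. The following is a minimal, hypothetical example using the gensim library; the toy corpus and hyperparameter values are my own assumptions for illustration, not anything from the production systems mentioned above:

```python
# A minimal sketch of training word2vec embeddings with gensim (4.x).
# The toy corpus and hyperparameters below are illustrative assumptions.
from gensim.models import Word2Vec

# Each "sentence" is a pre-tokenized list of words.
corpus = [
    ["king", "rules", "the", "kingdom"],
    ["queen", "rules", "the", "kingdom"],
    ["man", "walks", "in", "the", "city"],
    ["woman", "walks", "in", "the", "city"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the word vectors
    window=2,         # context window size on each side of the center word
    min_count=1,      # keep every word, even if it appears only once
    sg=1,             # 1 = skip-gram, 0 = CBOW
)

vec = model.wv["king"]                # a 50-dimensional numpy vector
print(model.wv.most_similar("king"))  # nearest neighbors by cosine similarity
```

Setting `sg=1` selects the skip-gram architecture; the same model object also supports CBOW (`sg=0`), the other word2vec variant.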
In this post, we’ll go over the concept of embeddings and the mechanics of generating them with word2vec. But let’s start with an example to get familiar with using vectors to represent things. Did you know that a list of five numbers (a vector) can represent so much about your personality?
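As a small preview of that example, here is a sketch of how a five-number vector can stand in for a "personality," and how two such vectors can be compared with cosine similarity, the same kind of similarity measure commonly used to compare embeddings. The trait scores and names below are invented for illustration:

```python
# A toy sketch: representing "personality" as a vector of five numbers
# and comparing two people with cosine similarity. All scores are invented.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Five hypothetical trait scores per person, on a -1..1 scale.
me       = np.array([-0.4, 0.8, 0.5, -0.2, 0.3])
person_a = np.array([-0.3, 0.2, 0.3, -0.4, 0.9])
person_b = np.array([-0.5, 0.4, -0.2, 0.7, -0.1])

print(cosine_similarity(me, person_a))  # higher score = more similar
print(cosine_similarity(me, person_b))
```

The point of the sketch: once things are represented as vectors, "how similar are these two things?" reduces to a simple geometric question, and that holds whether the vectors describe personalities or words.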