Custom instructions for ChatGPT
ChatGPT can now remember who you are and what you want
Now, you can ask your chatbot a question without a paragraph-long preamble.
REGULATE ME | On the Media | WNYC Studios
Regulating AI; Old Hollywood Myths; Fixing the Internet
Introducing S-GPT, A Shortcut to Connect OpenAI’s ChatGPT with Native Features of Apple’s Operating Systems
It’s the inaugural week of the second annual edition of Automation April, and to celebrate the occasion, I’ve been working on something special: today, I’m introducing S-GPT, an advanced conversational shortcut for ChatGPT that bridges OpenAI’s assistant to native system features of iOS, iPadOS, macOS, and watchOS. S-GPT (which stands for Shortcuts-GPT) is free to
Introducing Micro.blog podcast transcripts
We’ve launched a new feature for Micro.blog Premium customers: automatic podcast episode transcripts, powered by OpenAI’s Whisper model. I’m excited about this because it’s one of the more practical, time-saving solutions coming out of the rise of AI. The automatic transcripts are so accurate they can be used as-is, or edited by hand as you have time.
I thought it would be clever to ask ChatGPT to write a blog post announcing this feature.
BuzzFeed Is Quietly Publishing Whole AI-Generated Articles, Not Just Quizzes
After announcing earlier this year a pivot to quizzes co-written by AI, BuzzFeed seems to have widened its purview to include articles.
Pretending to Teach — Minimalist EdTech
Inspired by and forked from kettle11's world builder prompt for ChatGPT, this is a bare-bones adaptation showing how low the lift can be ...
The Problem With AI — Matt Gemmell
All technologies bring opportunity and threat, but the scales are rarely balanced.
Introducing Whisper
We’ve trained and are open-sourcing a neural net called Whisper that approaches human level robustness and accuracy on English speech recognition.
Whisper is an automatic speech recognition (ASR) system trained on 680,000 hours of multilingual and multitask supervised data collected from the web.
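Since Whisper is open source, transcribing an episode locally takes only a few lines. A minimal sketch, assuming the `openai-whisper` package (`pip install openai-whisper`) and its documented result shape (a dict with a `segments` list of `{'start', 'end', 'text'}` entries); the filename and the timestamp helper are illustrative, not any site's actual pipeline:

```python
def format_timestamp(seconds: float) -> str:
    """Render a time offset in seconds as H:MM:SS."""
    hours, rem = divmod(int(seconds), 3600)
    minutes, secs = divmod(rem, 60)
    return f"{hours}:{minutes:02d}:{secs:02d}"


def segments_to_transcript(segments) -> str:
    """Join Whisper segment dicts ({'start', 'end', 'text'})
    into a timestamped, human-readable transcript."""
    return "\n".join(
        f"[{format_timestamp(seg['start'])}] {seg['text'].strip()}"
        for seg in segments
    )


if __name__ == "__main__":
    import whisper  # requires ffmpeg and a one-time model download

    model = whisper.load_model("base")           # smallest models run on CPU
    result = model.transcribe("episode.mp3")     # hypothetical input file
    print(segments_to_transcript(result["segments"]))
```

The helper functions are pure Python, so transcripts can be reformatted (or hand-edited) without re-running the model.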
An Empirical Model of Large-Batch Training
In an increasing number of domains, it has been demonstrated that deep learning models can be trained using relatively large batch sizes without sacrificing data efficiency. However, the limits of...
Teacher-Student Curriculum Learning
We propose Teacher-Student Curriculum Learning (TSCL), a framework for automatic curriculum learning, where the Student tries to learn a complex task and the Teacher automatically chooses subtasks...