Machine Learning & Artificial Intelligence

378 bookmarks
GitHub - qiuyu96/CoDeF: Official PyTorch implementation of CoDeF: Content Deformation Fields for Temporally Consistent Video Processing
Official PyTorch implementation of CoDeF: Content Deformation Fields for Temporally Consistent Video Processing.
·github.com·
What are Large Language Models (LLMs)?
In this article, we will understand the concept of Large Language Models (LLMs) and their importance in natural language processing.
·analyticsvidhya.com·
XLNet
We’re on a journey to advance and democratize artificial intelligence through open source and open science.
·huggingface.co·
DistilBERT
We’re on a journey to advance and democratize artificial intelligence through open source and open science.
·huggingface.co·
XLM-RoBERTa
We’re on a journey to advance and democratize artificial intelligence through open source and open science.
·huggingface.co·
RoBERTa: An optimized method for pretraining self-supervised NLP systems
Facebook AI’s RoBERTa is a new training recipe that improves on BERT, Google’s self-supervised method for pretraining natural language processing systems. By training longer, on more data, and dropping BERT’s next-sentence prediction objective, RoBERTa topped the GLUE leaderboard.
·ai.meta.com·
AI21 Studio
A powerful language model, with an API that makes you smile.
·ai21.com·
GPT-NeoX
We’re on a journey to advance and democratize artificial intelligence through open source and open science.
·huggingface.co·
GPT-J
We’re on a journey to advance and democratize artificial intelligence through open source and open science.
·huggingface.co·
Product
Our API platform offers our latest models and guides for safety best practices.
·openai.com·
Meta Open-Sources 175 Billion Parameter AI Language Model OPT
Meta AI Research released Open Pre-trained Transformer (OPT-175B), a 175B-parameter AI language model. The model was trained on a dataset containing 180B tokens and exhibits performance comparable with GPT-3, while requiring only 1/7th of GPT-3's training carbon footprint.
·infoq.com·