Distinguishing academic science writing from humans or ChatGPT with over 99% accuracy using off-the-shelf machine learning tools
The Curse of Recursion: Training on Generated Data Makes Models Forget
A transformer-based representation-learning model with unified processing of multimodal input for clinical diagnostics
LLaVA-Med: Training a Large Language-and-Vision Assistant for Biomedicine in One Day
FrugalGPT: How to Use Large Language Models While Reducing Cost and Improving Performance
DNA-GPT: Divergent N-Gram Analysis for Training-Free Detection of GPT-Generated Text
Recognize Anything: A Strong Image Tagging Model
Gong, Y., Rouditchenko, A., Liu, A. H., Harwath, D., Karlinsky, L., Kuehne, H., & Glass, J. (2022). Contrastive audio-visual masked autoencoder. arXiv preprint arXiv:2210.07839.
Advances in apparent conceptual physics reasoning in GPT-4
Cross-Lingual Supervision improves Large Language Models Pre-training
Deliberate then Generate: Enhanced Prompting Framework for Text Generation
Developing a platform for linear mechanical quantum computing
Simple and Controllable Music Generation
Entailment as Robust Self-Learner
LEACE: Perfect linear concept erasure in closed form
Explainable Goal-driven Agents and Robots - A Comprehensive Review | ACM Computing Surveys
CAPE: Camera View Position Embedding for Multi-View 3D Object Detection
Faster sorting algorithms discovered using deep reinforcement learning
Emotion prediction as computation over a generative theory of mind | Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences
Enabling Scalable AI Computational Lithography with Physics-Inspired Models
Faith and Fate: Limits of Transformers on Compositionality
Schmitt, L. (2021). Mapping global AI governance: a nascent regime in a fragmented landscape
Feldstein, S. (2019). How Artificial Intelligence is Reshaping Repression
CodeTF: One-stop Transformer Library for State-of-the-art Code LLM
Bytes Are All You Need: Transformers Operating Directly On File Bytes
Fine-Tuning Language Models with Just Forward Passes
The Impact of Positional Encoding on Length Generalization in Transformers
SQL-PaLM: Improved Large Language Model Adaptation for Text-to-SQL
MERT: Acoustic Music Understanding Model with Large-Scale Self-supervised Training