🧑🏫 60 implementations/tutorials of deep learning papers with side-by-side notes 📝, including transformers (original, XL, Switch, Feedback, ViT, ...), optimizers (Adam, AdaBelief, Sophia, ...), GANs (CycleGAN, StyleGAN2, ...), 🎮 reinforcement learning (PPO, DQN), CapsNet, distillation, ... 🧠
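Of the techniques catalogued there, the optimizers are the simplest to show in a few lines. As a hedged illustration (not code from that repo), a single Adam update step looks roughly like this, with `param`, `grad`, and the moment buffers `m`, `v` as hypothetical tensors of the same shape:

```python
import torch

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update step: exponential moving averages of the gradient (m)
    and squared gradient (v), with bias correction for the first t steps."""
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad * grad
    m_hat = m / (1 - beta1 ** t)      # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)      # bias-corrected second moment
    param = param - lr * m_hat / (torch.sqrt(v_hat) + eps)
    return param, m, v
```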
Implementation of Vision Transformer, a simple way to achieve SOTA in vision classification with only a single transformer encoder, in PyTorch
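A minimal from-scratch sketch of the idea (illustrative, not the repo's code): split the image into patches, embed them, prepend a class token, run a standard transformer encoder, and classify from the class token. All names and sizes below are arbitrary choices for the example.

```python
import torch
import torch.nn as nn

class TinyViT(nn.Module):
    """Minimal ViT-style classifier: patch embedding + class token + transformer encoder."""
    def __init__(self, image_size=32, patch_size=8, dim=64, depth=2, heads=4, num_classes=10):
        super().__init__()
        num_patches = (image_size // patch_size) ** 2
        self.to_patches = nn.Conv2d(3, dim, kernel_size=patch_size, stride=patch_size)
        self.cls_token = nn.Parameter(torch.zeros(1, 1, dim))
        self.pos_emb = nn.Parameter(torch.zeros(1, num_patches + 1, dim))
        layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        self.head = nn.Linear(dim, num_classes)

    def forward(self, img):                                   # img: (B, 3, H, W)
        x = self.to_patches(img).flatten(2).transpose(1, 2)   # (B, num_patches, dim)
        cls = self.cls_token.expand(x.size(0), -1, -1)
        x = torch.cat([cls, x], dim=1) + self.pos_emb
        x = self.encoder(x)
        return self.head(x[:, 0])                             # classify from the class token

logits = TinyViT()(torch.randn(2, 3, 32, 32))                 # (2, 10)
```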
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), so it combines the best of RNNs and transformers: great performance, fast inference, low VRAM usage, fast training, "infinite" ctx_len, and free sentence embeddings.
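A heavily simplified, hedged sketch of the recurrence behind RWKV's RNN-style inference (not the repo's implementation, which uses a numerically stabilized form plus a parallel, GPT-like training mode): each channel keeps exponentially decaying sums over past keys and values, so generating a new token only needs O(1) state.

```python
import torch

def wkv_recurrent(k, v, w, u):
    """Simplified RWKV-style WKV recurrence (no numerical stabilization).
    k, v: (T, C) keys/values; w: (C,) positive per-channel decay; u: (C,) bonus for the current token.
    The state is just two (C,) vectors, which is why inference is O(1) per token like an RNN."""
    T, C = k.shape
    a = torch.zeros(C)                      # decaying sum of exp(k_i) * v_i
    b = torch.zeros(C)                      # decaying sum of exp(k_i)
    out = torch.empty(T, C)
    for t in range(T):
        e_cur = torch.exp(u + k[t])         # extra weight on the current token
        out[t] = (a + e_cur * v[t]) / (b + e_cur)
        a = torch.exp(-w) * a + torch.exp(k[t]) * v[t]
        b = torch.exp(-w) * b + torch.exp(k[t])
    return out
```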
🍀 PyTorch implementations of various attention mechanisms, MLPs, re-parameterization modules, and convolutions, helpful for further understanding the papers. ⭐⭐⭐
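As a hedged example of the kind of module such a collection contains (a generic squeeze-and-excitation channel attention block, written here for illustration rather than copied from that repo):

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation channel attention: global-pool, bottleneck MLP, channel-wise rescale."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):                           # x: (B, C, H, W)
        w = x.mean(dim=(2, 3))                      # squeeze: (B, C)
        w = self.fc(w).unsqueeze(-1).unsqueeze(-1)  # excitation: (B, C, 1, 1)
        return x * w                                # rescale each channel
```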
《李宏毅深度学习教程》 (Hung-yi Lee's Deep Learning Tutorial); PDF download: https://github.com/datawhalechina/leedl-tutorial/releases
A PyTorch implementation of the Transformer model in "Attention is All You Need".
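The core of that paper is scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, which fits in a few lines; a minimal sketch (illustrative, not the repo's code):

```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, the building block of the Transformer."""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)        # (..., seq_q, seq_k)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))  # hide disallowed positions
    return F.softmax(scores, dim=-1) @ v
```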
Implementation of RLHF (Reinforcement Learning with Human Feedback) on top of the PaLM architecture. Basically ChatGPT but with PaLM
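One piece of the RLHF pipeline is easy to show in isolation: fitting a reward model from pairwise human preferences before the RL (e.g. PPO) stage. A hedged sketch of the standard pairwise loss (illustrative; not necessarily this repo's exact formulation, and the reward values below are made up):

```python
import torch
import torch.nn.functional as F

def reward_model_loss(reward_chosen, reward_rejected):
    """Pairwise preference loss for reward-model training:
    maximize log sigmoid(r_chosen - r_rejected) over human comparisons."""
    return -F.logsigmoid(reward_chosen - reward_rejected).mean()

# Hypothetical scalar rewards for a batch of preferred vs. rejected responses
loss = reward_model_loss(torch.tensor([1.2, 0.3]), torch.tensor([0.4, -0.1]))
```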
Implementation / replication of DALL-E, OpenAI's text-to-image transformer, in PyTorch
Tutorials on implementing a few sequence-to-sequence (seq2seq) models with PyTorch and TorchText.
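As a hedged sketch of the basic pattern those tutorials start from (a plain GRU encoder-decoder with teacher forcing, before attention is added; the class and sizes here are illustrative, not the tutorials' code):

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """Minimal GRU encoder-decoder: encode the source, then decode the target with teacher forcing."""
    def __init__(self, src_vocab, trg_vocab, emb_dim=128, hid_dim=256):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb_dim)
        self.trg_emb = nn.Embedding(trg_vocab, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, trg_vocab)

    def forward(self, src, trg):                          # src: (B, S), trg: (B, T) token ids
        _, hidden = self.encoder(self.src_emb(src))       # hidden: (1, B, hid_dim) summarizes the source
        dec_out, _ = self.decoder(self.trg_emb(trg), hidden)
        return self.out(dec_out)                          # (B, T, trg_vocab) logits
```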
A collection of important graph embedding, classification and representation learning papers with implementations.
pycorrector is a toolkit for text error correction. Implements KenLM, ConvSeq2Seq, BERT, MacBERT, ELECTRA, ERNIE, Transformer, T5, and other models for Chinese text error correction; works out of the box.
The official repo of Qwen-7B (通义千问-7B), the chat & pretrained large language model proposed by Alibaba Cloud.
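A hedged usage sketch via Hugging Face Transformers, assuming the `Qwen/Qwen-7B-Chat` model id and the `chat()` helper documented in the repo's README at the time of writing (verify against the current README before relying on it):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# trust_remote_code is needed because Qwen ships custom modeling/tokenization code
tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-7B-Chat", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    "Qwen/Qwen-7B-Chat", device_map="auto", trust_remote_code=True
).eval()

# chat() is a helper provided by the repo's remote code; it returns the reply and updated history
response, history = model.chat(tokenizer, "Briefly introduce yourself.", history=None)
print(response)
```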
A simple but complete full-attention transformer with a set of promising experimental features from various papers
A comprehensive paper list on Vision Transformers/attention, including papers, code, and related websites
中文LLaMA-2 & Alpaca-2大模型二期项目 + 16K超长上下文模型 (Chinese LLaMA-2 & Alpaca-2 LLMs, including 16K long context models)