Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
Official Implementation of OCR-free Document Understanding Transformer (Donut) and Synthetic Document Generator (SynthDoG), ECCV 2022
A LITE BERT FOR SELF-SUPERVISED LEARNING OF LANGUAGE REPRESENTATIONS, a massive Chinese pre-trained ALBERT model
Language Understanding Evaluation benchmark for Chinese: datasets, baselines, pre-trained models, corpus, and leaderboard
[CVPR 2021] Involution: Inverting the Inherence of Convolution for Visual Recognition, a brand new neural operator
[ICLR'23 Spotlight] The first successful BERT/MAE-style pretraining on any convolutional network; PyTorch impl. of "Designing BERT for Convolutional Networks: Sparse and Hierarchical Masked Modeling"
[MICCAI 2019] [MEDIA 2020] Models Genesis
An open-source knowledgeable large language model framework.
PyTorch code for "Prototypical Contrastive Learning of Unsupervised Representations"
Official PyTorch implementation of FasterViT: Fast Vision Transformers with Hierarchical Attention
[ICML 2023] Official PyTorch implementation of Global Context Vision Transformers
Self-supervised contrastive learning for time series via time-frequency consistency