🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
The official gpt4free repository | a collection of powerful language models
🐙 Guides, papers, lectures, notebooks, and resources for prompt engineering
OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and retrieve information dynamically to do so.
Code and documentation to train Stanford's Alpaca models and generate the data.
Enable everyone to develop, optimize, and deploy AI models natively on their own devices.
🔍 LLM orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search, or conversational agent chatbots.
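As a rough illustration of the component-to-pipeline pattern described above (a generic Python sketch, not Haystack's actual API; the `Pipeline` class and the retrieve/build_prompt/generate stages are hypothetical stand-ins for real retrievers and model wrappers):

```python
# Generic sketch of "connect components into a pipeline" -- NOT Haystack's API.
# Each stage is a callable that transforms the output of the previous stage.
from typing import Callable, List

class Pipeline:
    def __init__(self):
        self.components: List[Callable] = []

    def add(self, component: Callable) -> "Pipeline":
        self.components.append(component)
        return self

    def run(self, query: str):
        data = query
        for component in self.components:   # each stage feeds the next
            data = component(data)
        return data

# Hypothetical stages of a retrieval-augmented (RAG) flow: retrieve documents,
# build a prompt, call a model. In practice these would wrap a vector DB and an LLM.
retrieve = lambda q: {"query": q, "docs": ["doc about " + q]}
build_prompt = lambda d: f"Answer '{d['query']}' using: {d['docs']}"
generate = lambda prompt: "answer based on -> " + prompt

rag = Pipeline().add(retrieve).add(build_prompt).add(generate)
print(rag.run("vector databases"))
```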
RWKV is an RNN with transformer-level LLM performance. It can be trained directly like a GPT (parallelizable), so it combines the best of RNNs and transformers: great performance, fast inference, low VRAM use, fast training, "infinite" ctx_len, and free sentence embeddings.
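To make the constant-memory claim concrete, here is a toy recurrence in the same spirit (this is not RWKV's actual WKV formula; the per-channel decay and weighting scheme are simplified assumptions): each token updates a fixed-size state, so inference memory does not grow with context length the way a transformer's KV cache does.

```python
# Toy illustration of the "RNN with constant-size state" idea behind RWKV.
# NOT the real WKV recurrence -- only shows why per-token inference memory is
# O(d) regardless of context length.
import numpy as np

d = 8                                        # channel dimension
decay = np.exp(-np.linspace(0.1, 1.0, d))    # per-channel decay (illustrative)
num_state = np.zeros(d)                      # running weighted sum of past values
den_state = np.zeros(d)                      # running sum of weights

def step(k, v):
    """Consume one token (key k, value v); the state size stays constant."""
    global num_state, den_state
    w = np.exp(k)                            # unnormalized weight for this token
    num_state = decay * num_state + w * v
    den_state = decay * den_state + w
    return num_state / (den_state + 1e-8)    # decay-weighted average of past v's

for t in range(1000):                        # context grows, memory does not
    out = step(np.random.randn(d), np.random.randn(d))
```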
ChatRWKV is like ChatGPT, but it is powered by the RWKV (100% RNN) language model and is open source.
Large-scale Chinese corpus for NLP (natural language processing)
An implementation of model-parallel GPT-2- and GPT-3-style models using the mesh-tensorflow library.
Bringing large language models and chat to web browsers. Everything runs inside the browser with no server support.
An Extensible Toolkit for Finetuning and Inference of Large Foundation Models. Large Models for All.
🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading
A PyTorch-based Speech Toolkit
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
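For reference, the core LoRA idea is to freeze the pretrained weight and learn a low-rank update BA on top of it, so only a small fraction of parameters train. A minimal PyTorch sketch (an illustrative module, not loralib's actual API; the r and alpha values are arbitrary) might look like:

```python
# Minimal LoRA-style linear layer: y = W0 x + (alpha / r) * B(A(x)).
# Illustrative only -- not loralib's API. W0 is frozen; only A and B train.
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    def __init__(self, in_features, out_features, r=8, alpha=16):
        super().__init__()
        self.base = nn.Linear(in_features, out_features)
        self.base.weight.requires_grad_(False)    # freeze pretrained weights
        self.base.bias.requires_grad_(False)
        self.lora_a = nn.Linear(in_features, r, bias=False)   # A: d_in -> r
        self.lora_b = nn.Linear(r, out_features, bias=False)  # B: r -> d_out
        nn.init.zeros_(self.lora_b.weight)        # update starts at zero
        self.scaling = alpha / r

    def forward(self, x):
        return self.base(x) + self.scaling * self.lora_b(self.lora_a(x))

layer = LoRALinear(768, 768)
out = layer(torch.randn(2, 768))   # only lora_a / lora_b receive gradients
```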
Open Source Neural Machine Translation and (Large) Language Models in PyTorch
An open-source implementation of CLIP (Contrastive Language-Image Pretraining).
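CLIP is trained with a symmetric contrastive loss over a batch of matched image/text pairs. A minimal PyTorch sketch of that objective (encoder outputs stubbed with random tensors; the function name is illustrative and this is not OpenCLIP's code) could be:

```python
# Sketch of CLIP's symmetric contrastive (InfoNCE) objective over a batch of
# matched image/text embeddings. Encoders are stand-ins, not OpenCLIP code.
import torch
import torch.nn.functional as F

def clip_loss(image_emb, text_emb, temperature=0.07):
    # L2-normalize, then compute cosine-similarity logits between every
    # image and every text in the batch.
    image_emb = F.normalize(image_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)
    logits = image_emb @ text_emb.t() / temperature
    targets = torch.arange(len(logits))              # i-th image matches i-th text
    loss_i = F.cross_entropy(logits, targets)        # image -> text direction
    loss_t = F.cross_entropy(logits.t(), targets)    # text -> image direction
    return (loss_i + loss_t) / 2

loss = clip_loss(torch.randn(4, 512), torch.randn(4, 512))
```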