5 open-source LLMs #2
We've collected links to each model's research paper and code repository.
NLLB (No Language Left Behind) is a series of open-source models capable of delivering high-quality translations directly between any pair of 200+ languages.
GLM-130B is an open bilingual (English & Chinese) bidirectional dense model with 130 billion parameters.
RWKV is an RNN with Transformer-level LLM performance, which can also be directly trained like a GPT transformer (parallelizable).
Flan-T5 is an instruction-finetuned version of T5, which improves zero-shot and few-shot performance across a wide range of tasks.
Galactica is a general-purpose scientific language model. It is trained on a large corpus of scientific text and data.
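RWKV's claim above — RNN-style inference with GPT-style parallelizable training — rests on replacing attention with a linear recurrence. Here is a toy sketch of that idea, assuming a simplified single-channel WKV with a scalar decay `w` (the real RWKV adds a bonus term for the current token and learned per-channel decays); the same output can be computed step by step with O(1) state, or all at once per position:

```python
import math

def wkv_sequential(w, ks, vs):
    """RNN mode: carry a running numerator/denominator, decaying the
    state by e^{-w} each step and adding the current key/value term."""
    num, den = 0.0, 0.0
    out = []
    for k, v in zip(ks, vs):
        num = num * math.exp(-w) + math.exp(k) * v
        den = den * math.exp(-w) + math.exp(k)
        out.append(num / den)
    return out

def wkv_parallel(w, ks, vs):
    """GPT mode: for each position t, a weighted average of all v_i
    with weights e^{-(t-i)w + k_i} — computable for all t at once."""
    out = []
    for t in range(len(ks)):
        weights = [math.exp(-(t - i) * w + ks[i]) for i in range(t + 1)]
        out.append(sum(wt * v for wt, v in zip(weights, vs)) / sum(weights))
    return out
```

Both routes give identical outputs; during training the parallel form is used, while at inference the sequential form needs only constant memory per token.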
Every day we post helpful lists and bite-sized explanations on our Twitter. Please join us there!
5 open-source LLMs (save the list)
1. NLLB
2. GLM-130B
3. RWKV (RNN with Transformer-level performance)
4. Flan-T5
5. Galactica
All repos in 🧵
— TuringPost (@TheTuringPost)
1:03 PM · Apr 12, 2023