
5 open-source large language models you can use in your project

Papers and code for BLOOMZ, OPT-IML, Pythia, LLaMA, and Vicuna

We collected links to the research papers and code repositories for each model; a minimal loading sketch follows the list.

  • BLOOMZ is the model obtained by applying multitask prompted fine-tuning (instruction tuning) to the pre-trained multilingual BLOOM.

  • OPT-IML (OPT + Instruction Meta-Learning) is a set of instruction-tuned versions of OPT, fine-tuned on a collection of ~2,000 NLP tasks gathered from 8 NLP benchmarks, called OPT-IML Bench.

  • Pythia is a suite of 16 LLMs, all trained on public data seen in the same order and ranging in size from 70M to 12B parameters.

  • LLaMA is a collection of foundation language models ranging from 7B to 65B parameters.

  • Vicuna is an open-source chatbot trained by fine-tuning LLaMA on user-shared conversations collected from ShareGPT.
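
Below is a minimal sketch of how one of these checkpoints could be loaded with Hugging Face Transformers. The model ID "bigscience/bloomz-560m" is an assumption about where a small BLOOMZ checkpoint is published on the Hub; swap in another ID (for example "EleutherAI/pythia-70m") to try a different model from the list.

```python
# Minimal sketch: load an open-source LLM with Hugging Face Transformers.
# The checkpoint names below are assumptions, not the only options.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "bigscience/bloomz-560m"  # assumed BLOOMZ checkpoint on the Hub

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# BLOOMZ is instruction-tuned, so a plain natural-language prompt works.
prompt = "Translate to French: I love open-source language models."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The same two-line loading pattern applies to Pythia and LLaMA-derived checkpoints such as Vicuna, provided you have access to the corresponding weights.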

Every day we post helpful lists and bite-sized explanations on our Twitter. Please join us there.
