Dolly

  • Family: GPT

  • Pretraining Architecture: Decoder

  • Fine-tuning Task: human instructions

  • Extension: fine-tuned from GPT-J-6B (V1) and Pythia-12B (V2)

  • Application: Similar to Alpaca

  • Date (of first known publication): 03/2023

  • Num. Params: V1: 6B, V2: 12B

  • Corpus: V1: same instruction corpus as Alpaca; V2: databricks-dolly-15k, Databricks' own dataset of human-written instruction/response pairs

  • License: Open (V2 is licensed for commercial use)

  • Lab: Databricks, Inc.
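Since the card describes Dolly as an instruction-tuned model similar to Alpaca, a minimal sketch of the Alpaca-style prompt template such models are typically queried with may help. The exact wording of the template below is an assumption based on the public Alpaca format, not something specified by this card; `build_prompt` is a hypothetical helper name.

```python
def build_prompt(instruction: str, context: str = "") -> str:
    """Assemble an Alpaca-style instruction prompt (assumed template)."""
    header = ("Below is an instruction that describes a task. "
              "Write a response that appropriately completes the request.")
    if context:
        # Some instructions come with additional input/context.
        return (f"{header}\n\n### Instruction:\n{instruction}\n\n"
                f"### Input:\n{context}\n\n### Response:\n")
    return f"{header}\n\n### Instruction:\n{instruction}\n\n### Response:\n"

prompt = build_prompt("Explain what instruction tuning is.")
print(prompt)
```

The completed prompt would then be passed to the model's text-generation interface; the model's reply is whatever it emits after the `### Response:` marker.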