Pythia
Family: Pythia
Pretraining Architecture: Decoder
Extension: Trained with the GPT-NeoX library
Application: Research on language model behavior, functionality, and limitations
Date (of first known publication): 04/2023
Num. Params: 70M, 160M, 410M, 1B, 1.4B, 2.8B, 6.9B, 12B
Corpus: The Pile
License: Open, Apache-2.0
Lab: EleutherAI
Last updated:
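All checkpoints in the suite are released openly, so any of the sizes above can be loaded for experimentation. A minimal sketch, assuming the Hugging Face `transformers` library and the `EleutherAI/pythia-70m` repository id used on the Hub (swap in any of the other sizes as needed):

```python
# Minimal sketch: load a Pythia checkpoint and generate a short continuation.
# Assumes the `transformers` library and the EleutherAI/pythia-70m Hub repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/pythia-70m"  # e.g. pythia-160m, pythia-1.4b, pythia-12b
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("Pythia is a suite of decoder-only language models", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0]))
```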