PaLM 2
Google has not publicly released the details needed to classify and understand the model even at a basic level (e.g., parameter count or corpus composition)
Family: Transformer
Pretraining Architecture: Decoder
Pretraining Task: LM
Extension: PaLM
Application: PaLM 2 is designed as a general-purpose language model applicable to hundreds of different language tasks
Date (of first known publication): 05/2023
Num. Params: Not publicly disclosed
Corpus: Not publicly disclosed
License: Closed source; accessible through API
Lab: Google