Cost

While the cost of traditional MLOps usually lies in data collection and model training, the cost of LLMOps lies mostly in inference. Experimenting with expensive APIs adds some upfront cost, but as Chip Huyen points out, the recurring expense comes from serving long prompts at inference time: every request pays for its prompt tokens again and again.
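To make this concrete, here is a minimal sketch of how prompt length drives recurring inference cost. The per-token prices and traffic numbers are illustrative assumptions, not actual vendor rates.

```python
# Minimal sketch: estimating per-request inference cost from token counts.
# Prices below are placeholder assumptions, not real vendor pricing.

def estimate_request_cost(
    prompt_tokens: int,
    completion_tokens: int,
    price_per_1k_prompt: float = 0.01,      # assumed price (USD per 1K prompt tokens)
    price_per_1k_completion: float = 0.03,  # assumed price (USD per 1K completion tokens)
) -> float:
    """Estimated cost in USD for a single inference request."""
    return (
        prompt_tokens / 1000 * price_per_1k_prompt
        + completion_tokens / 1000 * price_per_1k_completion
    )

# A long prompt dominates the bill: 10,000 requests/day with a 2,000-token prompt
# and a short 200-token completion.
daily_cost = 10_000 * estimate_request_cost(prompt_tokens=2_000, completion_tokens=200)
print(f"Estimated daily inference cost: ${daily_cost:,.2f}")
```

Unlike training, which is a one-time expense, this cost scales with traffic, so trimming prompt length directly reduces the ongoing bill.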
