Tags: Text Generation · Transformers · PyTorch · English · llama · text-generation-inference · Inference Endpoints
Shouldn't CodeLlama 34B have 16K context and rope_theta 1M?
Commit: 1a41a8c
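For reference, here is a minimal sketch of how one might check the two settings the question refers to. It assumes the upstream repo id `codellama/CodeLlama-34b-hf` and a recent `transformers` version that exposes `rope_theta` on `LlamaConfig`; the expected values (16384 and 1e6) come from the question itself, not from this repository's config.

```python
# Minimal sketch: print the context-length settings in question.
# Assumptions: upstream repo id "codellama/CodeLlama-34b-hf" and a
# transformers version recent enough to carry rope_theta on LlamaConfig.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("codellama/CodeLlama-34b-hf")

# CodeLlama is trained with a 16K base context window and a larger RoPE
# base frequency (rope_theta), which the question says should be 1,000,000.
print("max_position_embeddings:", config.max_position_embeddings)   # expected 16384
print("rope_theta:", getattr(config, "rope_theta", None))           # expected 1000000.0
```

The same two fields could be compared against the `config.json` shipped in this quantized repo to see whether they match the upstream values.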