Tags: Text Generation · Transformers · PyTorch · bloom · text-generation-inference · Inference Endpoints
Muennighoff committed
Commit: fa03bb6
Parent: 632e89d

Update README.md

Files changed (1):
  1. README.md (+1 −1)
README.md CHANGED
@@ -70,7 +70,7 @@ We use `git tags` to load a model in a specific version (eg. `global_step1000`):
  ```python
  from transformers import AutoModelForCausalLM
  model = AutoModelForCausalLM.from_pretrained(
- "bigscience/bloom-750m-intermediate",
+ "bigscience/bloom-760m-intermediate",
  revision="global_step1000",
  torch_dtype="auto",
  )