Commit 70fa599 by srvm (1 parent: c4b8f78)

Update installation instructions

Files changed (1): README.md (+2 -3)
README.md CHANGED

@@ -48,11 +48,10 @@ It also uses Grouped-Query Attention (GQA) and Rotary Position Embeddings (RoPE)
 
 ## Usage
 
-The [pull request](https://github.com/huggingface/transformers/pull/32495) to support this model in Hugging Face Transformers is under review and is expected to be merged soon. In the meantime, please follow the installation instructions below:
+Support for this model will be added in the upcoming `transformers` release. In the meantime, please install the library from source:
 
 ```
-$ git clone -b aot/head_dim_rope --single-branch https://github.com/suiyoubi/transformers.git && cd transformers
-$ pip install -e .
+pip install git+https://github.com/huggingface/transformers
 ```
 
 The following code provides an example of how to load the Minitron-8B model and use it to perform text generation.
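The usage snippet the README refers to is not included in this diff. A minimal sketch of loading the model with the Transformers auto classes might look like the following; the checkpoint name `nvidia/Minitron-8B-Base`, the dtype, and the generation settings are assumptions, not taken from this commit, so check them against the model card before use.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical checkpoint name; confirm against the model card.
model_path = "nvidia/Minitron-8B-Base"

tokenizer = AutoTokenizer.from_pretrained(model_path)

# bfloat16 and device_map="auto" keep an 8B model within memory on
# typical accelerator setups; both are illustrative choices.
model = AutoModelForCausalLM.from_pretrained(
    model_path, torch_dtype=torch.bfloat16, device_map="auto"
)

prompt = "Deep learning is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Greedy decoding of a short continuation.
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that this requires the source install of `transformers` described above, plus `torch` and `accelerate` for `device_map="auto"`.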