Qubitium committed on
Commit 8ba7c19
1 Parent(s): 26db1b2

Update README.md

Files changed (1):
  1. README.md +8 -18
README.md CHANGED
@@ -1,26 +1,16 @@
  ---
- extra_gated_heading: You need to share contact information with Databricks to access this model
- extra_gated_prompt: >-
-
-   ### DBRX Terms of Use
-
-   Use of DBRX is governed by the [Databricks Open Model License](https://www.databricks.com/legal/open-model-license) and the [Databricks Open Model Acceptable Use Policy](https://www.databricks.com/legal/acceptable-use-policy-open-model).
-
- extra_gated_fields:
-   First Name: text
-   Last Name: text
-   Organization: text
-   Purpose for Base Model Access: text
-   By clicking 'Submit' below, I accept the terms of the license and acknowledge that the information I provide will be collected, stored, processed, and shared in accordance with Databricks' Privacy Notice and I understand I can update my preferences at any time: checkbox
- extra_gated_description: >-
-   The information you provide will be collected, stored, processed, and shared in accordance with Databricks [Privacy Notice](https://www.databricks.com/legal/privacynotice).
- extra_gated_button_content: Submit
- inference: false
  license: other
  license_name: databricks-open-model-license
  license_link: https://www.databricks.com/legal/open-model-license
+ tags:
+ - dbrx
+ - gptq
+ - '4bit '
+ - gptqmodel
  ---

+ This model has been quantized using [GPTQModel](https://github.com/ModelCloud/GPTQModel).
+
  # DBRX Base

  * DBRX Base is a mixture-of-experts (MoE) large language model trained from scratch by Databricks.
@@ -169,4 +159,4 @@ Full evaluation details can be found in our [technical blog post](https://www.da
  ## Acknowledgements
  The DBRX models were made possible thanks in large part to the open-source community, especially:
  * The [MegaBlocks](https://arxiv.org/abs/2211.15841) library, which established a foundation for our MoE implementation.
- * [PyTorch FSDP](https://arxiv.org/abs/2304.11277), which we built on for distributed training.
+ * [PyTorch FSDP](https://arxiv.org/abs/2304.11277), which we built on for distributed training.