Text Generation · Transformers · Safetensors · llama · text-generation-inference · unsloth · conversational · Eval Results · Inference Endpoints
Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -161,7 +161,7 @@ Although Replete-Coder has amazing coding capabilities, its trained on vaste amo
  ![image/png](https://cdn-uploads.huggingface.co/production/uploads/642cc1c253e76b4c2286c58e/-0dERC793D9XeFsJ9uHbx.png)

  Thank you to TensorDock for sponsoring Replete-Coder-llama3-8b and Replete-Coder-Qwen2-1.5b
- you can check out their website for cloud compute rental bellow.
+ you can check out their website for cloud compute rental below.
  - https://tensordock.com
  __________________________________________________________________________________________________
  Replete-Coder-llama3-8b is a general purpose model that is specially trained in coding in over 100 coding languages. The data used to train the model contains 25% non-code instruction data and 75% coding instruction data totaling up to 3.9 million lines, roughly 1 billion tokens, or 7.27gb of instruct data. The data used to train this model was 100% uncensored, then fully deduplicated, before training happened.