---
license: other
datasets:
- Open-Orca/OpenOrca
- ehartford/wizard_vicuna_70k_unfiltered
tags:
- code
- prompt
- reverse prompt
widget:
- text: "Photosynthesis is the process by which plants, algae and some bacteria convert carbon dioxide and water into glucose and oxygen, using the energy of sunlight. This process is fundamental to life on Earth, as it provides the basis for almost all food chains and also contributes to the carbon cycle by helping to regulate the concentration of carbon dioxide in the atmosphere. \n[REVERSED-PROMPT]"
example_title: "reverse prompt"
---
# PREVIEW - training will end on 4/9
Commit a87a7a188022bec44cffcb3ae9c250b8bacf7dd3 appears to be more stable than the latest commits; the next update will be posted on 6/9.
# core-prompt-reverser-opt-1.3b
This model is a fine-tuned version of facebook/opt-1.3b on the Open-Orca/OpenOrca and ehartford/wizard_vicuna_70k_unfiltered datasets listed in the metadata above.
It achieves the following results on the evaluation set:
- Loss: 1.2950
- Accuracy: 0.7084
## Model description
core-prompt-reverser-opt-1.3b is a prompt reverser: given a passage of text followed by the `[REVERSED-PROMPT]` marker, it generates a prompt that could have produced that passage, as shown in the widget example above.
## Intended uses & limitations
The model is intended for reverse prompting, i.e. reconstructing a plausible instruction or prompt from a given answer text. It is a preview checkpoint that is still being trained, so outputs may change between commits; a minimal inference sketch follows.
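A minimal inference sketch, assuming the repository path matches the model name and using the prompt format from the widget example (answer text followed by `[REVERSED-PROMPT]`):

```python
# Minimal inference sketch for the reverse-prompt format.
# The repo id below is an assumption based on the model name; adjust as needed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ccore/core-prompt-reverser-opt-1.3b"  # assumed repository path

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Answer text followed by the [REVERSED-PROMPT] marker, as in the widget example.
text = (
    "Photosynthesis is the process by which plants, algae and some bacteria "
    "convert carbon dioxide and water into glucose and oxygen, using the "
    "energy of sunlight. \n[REVERSED-PROMPT]"
)

inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```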
## Training and evaluation data
Training used the datasets listed in the metadata above (Open-Orca/OpenOrca and ehartford/wizard_vicuna_70k_unfiltered); preprocessing details and the composition of the evaluation set are not documented.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1.0
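A hedged sketch of how these hyperparameters map onto Hugging Face `TrainingArguments`; the output directory and the surrounding `Trainer` wiring are assumptions, not taken from the original training script:

```python
# Sketch only: reproduces the listed hyperparameters as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="core-prompt-reverser-opt-1.3b",  # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=1.0,
    lr_scheduler_type="linear",
    adam_beta1=0.9,   # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```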
### Training results
### Framework versions
- Transformers 4.33.0.dev0
- Pytorch 2.1.0.dev20230605+cu121
- Datasets 2.14.4
- Tokenizers 0.13.3