---
tags:
- generated_from_keras_callback
- music
model-index:
- name: juancopi81/mutopia_guitar_mmm
  results: []
datasets:
- juancopi81/mutopia_guitar_dataset
widget:
- text: >-
    PIECE_START TIME_SIGNATURE=4_4 BPM=90 TRACK_START INST=0 DENSITY=2
    BAR_START NOTE_ON=60
  example_title: Time signature 4/4, BPM=90
---
# juancopi81/mutopia_guitar_mmm
This model is a fine-tuned version of gpt2 on the Mutopia Guitar Dataset. Use the widget to generate your piece and then use this notebook to hear it (work in progress). The notebook is adapted from the one created by Dr. Tristan Behrens.
It achieves the following results on the evaluation set:
- Train Loss: 0.7588
- Validation Loss: 1.3974
- Epoch: 2
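
To try the model programmatically rather than through the widget, the sketch below loads the checkpoint with 🤗 Transformers and samples a continuation of the prompt from the widget example. The model id comes from this card; the generation parameters (`max_length`, `temperature`, `top_k`) are illustrative placeholders, not the settings used in the notebook.

```python
from transformers import AutoTokenizer, TFAutoModelForCausalLM

model_id = "juancopi81/mutopia_guitar_mmm"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForCausalLM.from_pretrained(model_id)

# Prompt in the dataset's event-token format (same as the widget example).
prompt = (
    "PIECE_START TIME_SIGNATURE=4_4 BPM=90 TRACK_START INST=0 DENSITY=2 "
    "BAR_START NOTE_ON=60"
)
inputs = tokenizer(prompt, return_tensors="tf")

# Sample a continuation; these sampling values are placeholders.
outputs = model.generate(
    inputs["input_ids"],
    max_length=256,
    do_sample=True,
    temperature=0.9,
    top_k=50,
)
print(tokenizer.decode(outputs[0]))
```

The resulting token sequence can then be turned into audio with the notebook mentioned above.
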
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an equivalent optimizer setup is sketched after this list):
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'class_name': 'WarmUp', 'config': {'initial_learning_rate': 5e-05, 'decay_schedule_fn': {'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 5e-05, 'decay_steps': 9089, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'passive_serialization': True}, 'warmup_steps': 1000, 'power': 1.0, 'name': None}}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: mixed_float16
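
The serialized optimizer above is an AdamWeightDecay optimizer with a 1,000-step warmup into a linear (power 1.0) polynomial decay over 9,089 steps, which is what `transformers.create_optimizer` builds for TensorFlow. A rough sketch of recreating this setup follows; the total step count of 10,089 is inferred from the warmup and decay steps, not stated in the card.

```python
import tensorflow as tf
from transformers import create_optimizer

# Matches the training_precision listed above.
tf.keras.mixed_precision.set_global_policy("mixed_float16")

# create_optimizer uses decay_steps = num_train_steps - num_warmup_steps,
# so decay_steps=9089 with warmup_steps=1000 implies 10089 total steps
# (an inference, not a value given in this card).
optimizer, lr_schedule = create_optimizer(
    init_lr=5e-5,
    num_train_steps=9089 + 1000,
    num_warmup_steps=1000,
    weight_decay_rate=0.01,
)

# The defaults (beta_1=0.9, beta_2=0.999, epsilon=1e-8, amsgrad=False)
# match the serialized configuration above.
# model.compile(optimizer=optimizer)  # then train with model.fit(...) as usual
```
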
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 1.0705     | 1.3590          | 0     |
| 0.8889     | 1.3702          | 1     |
| 0.7588     | 1.3974          | 2     |
### Framework versions
- Transformers 4.21.3
- TensorFlow 2.8.2
- Datasets 2.4.0
- Tokenizers 0.12.1