---
dataset_info:
- config_name: high_32
  features:
  - name: label
    dtype: int64
  - name: name
    dtype: int64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  splits:
  - name: train
    num_bytes: 12755204000
    num_examples: 250
  download_size: 10785191542
  dataset_size: 12755204000
- config_name: low_16
  features:
  - name: label
    dtype: int64
  - name: name
    dtype: string
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
splits: |
|
- name: train |
|
num_bytes: 7653281334 |
|
num_examples: 300 |
|
download_size: 6486055669 |
|
dataset_size: 7653281334 |
|
- config_name: low_32 |
|
features: |
|
- name: label |
|
dtype: int64 |
|
- name: name |
|
dtype: string |
|
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  splits:
  - name: train
    num_bytes: 15306247734
    num_examples: 300
  download_size: 12969592556
  dataset_size: 15306247734
- config_name: low_8
  features:
  - name: label
    dtype: int64
  - name: name
    dtype: string
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight
    sequence: float64
  - name: >-
      unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight
    sequence: float64
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
splits: |
|
- name: train |
|
num_bytes: 3826798134 |
|
num_examples: 300 |
|
download_size: 3495989797 |
|
dataset_size: 3826798134 |
|
- config_name: medium_16 |
|
features: |
|
- name: label |
|
dtype: int64 |
|
- name: name |
|
dtype: int64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
splits: |
|
- name: train |
|
num_bytes: 7653278400 |
|
num_examples: 300 |
|
download_size: 6486159448 |
|
dataset_size: 7653278400 |
|
- config_name: medium_32 |
|
features: |
|
- name: label |
|
dtype: int64 |
|
- name: name |
|
dtype: int64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
splits: |
|
- name: train |
|
num_bytes: 15306244800 |
|
num_examples: 300 |
|
download_size: 12972383269 |
|
dataset_size: 15306244800 |
|
- config_name: medium_32_2 |
|
features: |
|
- name: label |
|
dtype: int64 |
|
- name: name |
|
dtype: int64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.0.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.down_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: unet.mid_block.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.1.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.2.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.0.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.1.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn1.to_v.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_k.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_k.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_out.0.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_q.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_q.lora.up.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_v.lora.down.weight |
|
sequence: float64 |
|
- name: >- |
|
unet.up_blocks.3.attentions.2.transformer_blocks.0.attn2.to_v.lora.up.weight |
|
sequence: float64 |
|
splits: |
|
- name: train |
|
num_bytes: 15935390400 |
|
num_examples: 300 |
|
download_size: 14466889727 |
|
dataset_size: 15935390400 |
|
configs: |
|
- config_name: high_32 |
|
data_files: |
|
- split: train |
|
path: high/32/train-* |
|
- config_name: low_16 |
|
data_files: |
|
- split: train |
|
path: low/16/train-* |
|
- config_name: low_32 |
|
data_files: |
|
- split: train |
|
path: low/32/train-* |
|
- config_name: low_8 |
|
data_files: |
|
- split: train |
|
path: low/8/train-* |
|
- config_name: medium_16 |
|
data_files: |
|
- split: train |
|
path: medium/16/train-* |
|
- config_name: medium_32 |
|
data_files: |
|
- split: train |
|
path: medium/32/train-* |
|
- config_name: medium_32_2 |
|
data_files: |
|
- split: train |
|
path: medium/32_2/train-* |
|
task_categories: |
|
- tabular-classification |
|
- tabular-regression |
|
size_categories: |
|
- 1K<n<10K |
|
--- |
|
# Dataset Card for the LoRA WiSE benchmark

The **LoRA Weight Size Evaluation (LoRA-WiSE)** is a comprehensive benchmark designed to evaluate dataset size recovery methods for LoRA fine-tuned generative models. LoRA-WiSE spans a range of dataset sizes, backbones, LoRA ranks, and personalization sets, as presented in the "Dataset Size Recovery from LoRA Weights" [paper](https://arxiv.org/abs/2406.19395).
|
## Table of Contents

- [Task Details](#task-details)
- [Dataset Description](#dataset-description)
- [Dataset Structure](#dataset-structure)
  - [Data Subsets](#data-subsets)
  - [Data Fields](#data-fields)
- [Dataset Creation](#dataset-creation)
- [Citation Information](#citation-information)
|
- **🏠 Homepage:** https://vision.huji.ac.il/dsire/
- **🧑‍💻 Repository:** https://github.com/MoSalama98/dsire
- **📄 Paper:** https://arxiv.org/abs/2406.19395
- **✉️ Point of Contact:** mohammad.salama3@mail.huji.ac.il
|
## Task Details

**Dataset Size Recovery Setting:** We introduce the task of dataset size recovery, which aims to determine the number of samples used to train a model directly from its weights.

The setting for the task is as follows:

- The user has access to n different LoRA fine-tuned models, each annotated with its dataset size.
- All n models are assumed to originate from the same source model and to have been trained with identical hyper-parameters.
- Using only these n observed models, the goal is to predict the dataset size of new models trained under the same setup.

Our method, DSiRe, addresses this task, focusing on the important special case of recovering the number of images used to fine-tune a model, where fine-tuning was performed via LoRA. DSiRe achieves high accuracy in this setting, producing reliable results with as few as 5 models per dataset size category.
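As a toy illustration of this setting (a minimal sketch, not the DSiRe implementation from the paper), one could fit a nearest-centroid classifier over flattened model weights, with one centroid per dataset-size label:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for "n annotated LoRA models": each model is a flattened
# weight vector, labeled with the dataset size it was trained on.
# The log-scaled means are synthetic and purely illustrative.
labels = [1, 10, 50]
train = {y: rng.normal(loc=np.log(y), scale=0.1, size=(5, 64)) for y in labels}

# One centroid per dataset-size label.
centroids = {y: vecs.mean(axis=0) for y, vecs in train.items()}

def predict(weights: np.ndarray) -> int:
    """Return the dataset-size label whose centroid is closest."""
    return min(centroids, key=lambda y: np.linalg.norm(weights - centroids[y]))

# A new model's weights, drawn near the "10 images" cluster.
query = rng.normal(loc=np.log(10), scale=0.1, size=64)
print(predict(query))
```

The real task operates on per-layer LoRA matrices rather than a single synthetic vector, but the supervised "weights in, dataset size out" shape of the problem is the same.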
|
## Dataset Description

We present the **LoRA Weight Size Evaluation (LoRA-WiSE)** benchmark. It features the weights of 2,050 Stable Diffusion models, fine-tuned with the standard, widely used DreamBooth protocol via LoRA. The benchmark covers Stable Diffusion v1.5 and v2, with 1,750 and 300 fine-tuned models respectively.
We fine-tune the models over three ranges of dataset size:

- Low data range: 1-6 images.
- Medium data range: 1-50 images.
- High data range: 1-1000 images.

For each range, we use a discrete set of fine-tuning dataset sizes. For the low and medium ranges, we also provide versions of the benchmark with different LoRA ranks and backbones. See [Data Subsets](#data-subsets) for the precise benchmark details.
|
## Dataset Structure

The dataset contains seven subsets, each comprising 250-300 LoRA fine-tuned models. Each row in the dataset represents a single fine-tuned model and contains all the information needed for recovery and numerical evaluation.

Specifically, each row stores the weights of a single fine-tuned model across its 256 LoRA layers, plus two additional columns: "label" and "name". The "label" column gives the number of samples the model was fine-tuned on (its dataset size), while the "name" column identifies the micro-dataset used for fine-tuning.

We provide the LoRA layers' weights (the adapter weights) instead of the full model for two reasons:

- Providing only the LoRA weights significantly reduces the storage size of the dataset.
- Offering the LoRA weights enables users to study the properties of the fine-tuned LoRA layers, which may aid in developing new methods.
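Since each weight column stores its matrix as a flattened float sequence (per the `sequence: float64` features above), a row's weights need reshaping before analysis. A minimal sketch, where the rank and layer dimension are assumed values for illustration and should be taken from the subset you load:

```python
import numpy as np

# Hypothetical example: a weight column holds the matrix flattened row-major
# into a 1-D float sequence. For a rank-32 LoRA on a 320-dim projection,
# the down matrix has shape (rank, dim) = (32, 320).
rank, dim = 32, 320
flat = list(range(rank * dim))  # stand-in for one row's stored sequence

down = np.asarray(flat, dtype=np.float64).reshape(rank, dim)
print(down.shape)
```

In practice, `flat` would come from a row of a subset loaded via the `datasets` library, e.g. `load_dataset` with one of the config names listed under [Data Subsets](#data-subsets).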
|
#### Data Subsets

The table below describes the dataset subsets in detail:

| Subset Name | Dataset Sizes (labels)  | Source      | Backbone | # of Models | LoRA Rank |
|:-----------:|:-----------------------:|:-----------:|:--------:|:-----------:|:---------:|
| high_32     | [1, 10, 100, 500, 1000] | ImageNet    | SD 1.5   | 250         | 32        |
| medium_32_2 | [1, 10, 20, 30, 40, 50] | ImageNet    | SD 2     | 300         | 32        |
| medium_32   | [1, 10, 20, 30, 40, 50] | ImageNet    | SD 1.5   | 300         | 32        |
| medium_16   | [1, 10, 20, 30, 40, 50] | ImageNet    | SD 1.5   | 300         | 16        |
| low_32     | [1, 2, 3, 4, 5, 6]      | Concepts101 | SD 1.5   | 300         | 32        |
| low_16     | [1, 2, 3, 4, 5, 6]      | Concepts101 | SD 1.5   | 300         | 16        |
| low_8      | [1, 2, 3, 4, 5, 6]      | Concepts101 | SD 1.5   | 300         | 8         |
|
#### Data Fields

As described above, each row of the dataset represents a single fine-tuned model to be recovered and contains the following fields:

- `name` - the name of the micro-dataset that the model was fine-tuned on.
- `label` - the number of images used to fine-tune the model.
- `{lora_name}.lora.down.weight` - the LoRA down-projection (A) weight matrix of the corresponding layer, stored as a flattened float sequence.
- `{lora_name}.lora.up.weight` - the LoRA up-projection (B) weight matrix of the corresponding layer, stored as a flattened float sequence.

where `{lora_name}` is the name of a layer of the LoRA fine-tuned model in the subset (e.g., `unet.mid_block.attentions.0.transformer_blocks.0.attn1.to_q`).

**Note**: You can find the images in the "Files and versions" section under the folder named "images".
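For reference, the effective weight update represented by a down/up pair is their product. This is a generic LoRA sketch, not code from the paper; the `alpha / rank` scaling factor is an assumption and should be checked against the actual training configuration:

```python
import numpy as np

rng = np.random.default_rng(1)
rank, d_in, d_out = 8, 320, 320

# LoRA factorizes the weight update as up @ down, where
# down (A) has shape (rank, d_in) and up (B) has shape (d_out, rank).
down = rng.normal(size=(rank, d_in))
up = rng.normal(size=(d_out, rank))

alpha = 8  # LoRA scaling hyper-parameter (assumed; see the training config)
delta_w = (alpha / rank) * (up @ down)
print(delta_w.shape)
```

By construction `delta_w` has rank at most `rank`, which is why storing only the two factors is so much cheaper than storing the full model.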
|
## Dataset Creation

The fine-tuning of the models was performed with the [PEFT](https://huggingface.co/docs/peft/en/index) library on the [Concept101](https://www.cs.cmu.edu/~custom-diffusion/) and [ImageNet](https://www.image-net.org/) datasets.

For the full list of models and hyper-parameters, see the appendix of the "Dataset Size Recovery from LoRA Weights" [paper](https://arxiv.org/abs/2406.19395).
|
## Citation Information

If you use this dataset in your work, please cite the following paper:

**BibTeX:**

```
@article{salama2024dataset,
  title={Dataset Size Recovery from LoRA Weights},
  author={Salama, Mohammad and Kahana, Jonathan and Horwitz, Eliahu and Hoshen, Yedid},
  journal={arXiv preprint arXiv:2406.19395},
  year={2024}
}
```