This folder contains models trained for the following four couplings:

1. Suletta and Miorine from Suisei no Majo
2. Maple and Sally from Bofuri
3. Nozomi and Mizore from Sound! Euphonium
4. Eila and Sanya from Strike Witches

You can prompt the characters with the above names, except that Maple and Sally should be prompted as `BoMaple` and `BoSally`.
The trained couplings come out naturally, but it is also possible to pair characters from different series with either the native trained model (above) or the LoRA (below); a prompting sketch follows the sample images:

![native-00008-4163909665](https://huggingface.co/alea31415/YuriDiffusion/resolve/main/suremio-nozomizo-eilanya-maplesally/samples/native-00008-4163909665.png)
![lora-00017-691849602](https://huggingface.co/alea31415/YuriDiffusion/resolve/main/suremio-nozomizo-eilanya-maplesally/samples/lora-00017-691849602.png)
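
For reference, here is a minimal diffusers sketch of this kind of cross-series prompting. It is an illustration only: the checkpoint filename and the prompt are assumptions, not files or prompts shipped with this repo, and it needs a reasonably recent `diffusers`.

```python
import torch
from diffusers import StableDiffusionPipeline

# Hypothetical usage sketch; substitute the actual native-trained
# checkpoint file from this folder for the assumed filename below.
pipe = StableDiffusionPipeline.from_single_file(
    "suremio-nozomizo-eilanya-maplesally-native.safetensors",  # assumed filename
    torch_dtype=torch.float16,
).to("cuda")

# Character names act as trigger tokens; Maple and Sally use the
# BoMaple / BoSally forms. The models were trained with clip skip 2,
# so set clip skip to 2 in WebUI-style frontends as well.
prompt = "BoMaple and Mizore, 2girls, holding hands, smile"
image = pipe(prompt, num_inference_steps=28, guidance_scale=7.0).images[0]
image.save("cross-series-pair.png")
```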


### Dataset

Total size: 1394

Suremio 285
- Suletta: 81
- Miorine: 85
- Suremio: 119

MapleSally 235
- Maple: 67
- Sally: 22
- MapleSally: 33
Augmented with face cropping

Nozomizo 272
- Mizore: 81
- Nozomi: 81
- Nozomizo: 110

Eilanya 326
- Eila: 67
- Sanya: 78
- Eilanya: 181

Regularization 276


### Base model

[NMFSAN](https://huggingface.co/Crosstyan/BPModel/blob/main/NMFSAN/README.md), so the characters can be generated in a range of styles

### Native training

Trained with the [Kohya trainer](https://github.com/Linaqruf/kohya-trainer) using the settings below; a rough command sketch follows the list.

- text encoder training enabled
- learning rate 1e-6
- batch size 1
- clip skip 2
- number of training steps 64240
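
The actual run used the Kohya trainer notebook, but as a rough sketch the settings above correspond approximately to a kohya-ss `fine_tune.py` launch like the one below. All paths and filenames are placeholders, and flag names can differ between sd-scripts versions.

```python
import subprocess

# Rough sketch only: paths, the metadata JSON, and the base model file
# are placeholders; verify flag names against your sd-scripts version.
cmd = [
    "accelerate", "launch", "fine_tune.py",
    "--pretrained_model_name_or_path", "NMFSAN.safetensors",  # placeholder base model file
    "--in_json", "meta_lat.json",                             # placeholder metadata file
    "--train_data_dir", "train_images",
    "--output_dir", "output_native",
    "--train_text_encoder",        # text encoder training turned on
    "--learning_rate", "1e-6",
    "--train_batch_size", "1",
    "--clip_skip", "2",
    "--max_train_steps", "64240",
    "--mixed_precision", "fp16",
]
subprocess.run(cmd, check=True)
```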

*Examples*

![native-00001-2487967310](https://huggingface.co/alea31415/YuriDiffusion/resolve/main/suremio-nozomizo-eilanya-maplesally/samples/native-00001-2487967310.png)
![native-00010-2248582025](https://huggingface.co/alea31415/YuriDiffusion/resolve/main/suremio-nozomizo-eilanya-maplesally/samples/native-00010-2248582025.png)
![native-00014-3296158149](https://huggingface.co/alea31415/YuriDiffusion/resolve/main/suremio-nozomizo-eilanya-maplesally/samples/native-00014-3296158149.png)
![native-00048-3129463315](https://huggingface.co/alea31415/YuriDiffusion/resolve/main/suremio-nozomizo-eilanya-maplesally/samples/native-00048-3129463315.png)

### LoRA

For the training procedure, please refer to the [LoRA Training Guide](https://rentry.org/lora_train); a usage sketch follows the settings below.

- text encoder training enabled
- learning rate 1e-4
- batch size 6
- clip skip 2
- number of training steps 69700 (50 epochs)
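
For applying the LoRA at inference time, here is a minimal diffusers sketch. Both filenames are assumptions; substitute the actual NMFSAN base model and the LoRA file from this folder.

```python
import torch
from diffusers import StableDiffusionPipeline

# Hypothetical sketch: load a base checkpoint, then apply the LoRA on top.
pipe = StableDiffusionPipeline.from_single_file(
    "NMFSAN.safetensors", torch_dtype=torch.float16  # assumed base model file
).to("cuda")
pipe.load_lora_weights(
    ".", weight_name="suremio-nozomizo-eilanya-maplesally-lora.safetensors"  # assumed LoRA file
)

prompt = "Eila and Nozomi, 2girls, outdoors, smile"
image = pipe(prompt, num_inference_steps=28, guidance_scale=7.0).images[0]
image.save("lora-cross-pair.png")
```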

*Examples*

![lora-00012-1413123683](https://huggingface.co/alea31415/YuriDiffusion/resolve/main/suremio-nozomizo-eilanya-maplesally/samples/lora-00012-1413123683.png)
![lora-00014-1636023638](https://huggingface.co/alea31415/YuriDiffusion/resolve/main/suremio-nozomizo-eilanya-maplesally/samples/lora-00014-1636023638.png)
![lora-00015-969084934](https://huggingface.co/alea31415/YuriDiffusion/resolve/main/suremio-nozomizo-eilanya-maplesally/samples/lora-00015-969084934.png)
![lora-00071-2365359196](https://huggingface.co/alea31415/YuriDiffusion/resolve/main/suremio-nozomizo-eilanya-maplesally/samples/lora-00071-2365359196.png)