---
license: cc-by-4.0
---

[Link to Github Release](https://github.com/Phhofm/models/releases/tag/2xHFA2kCompact)

Name: 2xHFA2kCompact
Author: Philip Hofmann
Release Date: 18.04.2023
License: CC BY 4.0
Network: SRVGGNetCompact (600,652 parameters)
Scale: 2
Purpose: A compact 2x anime upscaling model trained on musl's HFA2k dataset

Iterations: 93,000
batch_size: 12
HR_size: 384
Epoch: 207 (214 iterations per epoch)
Dataset: HFA2k
Number of training images: 2,568
OTF Training: Yes
Pretrained_Model_G: 4x_Compact_Pretrain
Training time: 24h+

Description: A compact 2x anime upscaler trained with on-the-fly (OTF) compression and blur degradations. The '2xHFA2kCompact.pth' file (4.6 MB) is the original trained model; the other model files are conversions made with chaiNNer. Trained on musl's latest dataset release for anime SISR, which was extracted from modern anime films with selection criteria of high SNR, no depth of field, and high-frequency detail.

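Below is a minimal sketch of how the original `2xHFA2kCompact.pth` could be run with plain PyTorch, assuming the SRVGGNetCompact implementation from basicsr (the architecture family used by Real-ESRGAN). The `num_feat=64` / `num_conv=16` settings are an assumption inferred from the stated 600,652 parameter count, the checkpoint-unwrapping step is a guess at the saved layout, and the file names are placeholders; this is not an official inference script.

```python
# Minimal inference sketch (assumes: pip install basicsr opencv-python).
# num_feat=64 / num_conv=16 are inferred from the 600,652 parameter count.
import cv2
import numpy as np
import torch
from basicsr.archs.srvgg_arch import SRVGGNetCompact

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = SRVGGNetCompact(num_in_ch=3, num_out_ch=3, num_feat=64,
                        num_conv=16, upscale=2, act_type="prelu")
state = torch.load("2xHFA2kCompact.pth", map_location="cpu")
if isinstance(state, dict) and "params" in state:  # some trainers wrap the weights
    state = state["params"]
model.load_state_dict(state)
model.eval().to(device)

# Read an image, run the 2x upscale, and save the result.
img = cv2.imread("input.png", cv2.IMREAD_COLOR)  # BGR, uint8
tensor = torch.from_numpy(img[:, :, ::-1].copy()).permute(2, 0, 1).float() / 255.0
tensor = tensor.unsqueeze(0).to(device)

with torch.no_grad():
    out = model(tensor)

out = out.squeeze(0).clamp(0, 1).permute(1, 2, 0).cpu().numpy()[:, :, ::-1]
cv2.imwrite("output.png", (out * 255.0).round().astype(np.uint8))
```

The converted model files (e.g. ONNX or NCNN, as produced by chaiNNer) can instead be run directly in chaiNNer or other upscaling frontends without any code.
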
Examples: https://imgsli.com/MTcxNjA4

![Example1](https://github.com/Phhofm/models/assets/14755670/644cbf4c-8e01-4052-9c41-e127818a3ce5)
![Example2](https://github.com/Phhofm/models/assets/14755670/592dbf17-8268-4032-8a6e-fab277e69f6d)
![Example3](https://github.com/Phhofm/models/assets/14755670/01132e47-32c8-437f-8cc8-b938270db697)
![Example4](https://github.com/Phhofm/models/assets/14755670/8627f038-ab42-4e9a-9a97-2b8930487fa3)
![Example5](https://github.com/Phhofm/models/assets/14755670/a68a1260-f044-4e93-b969-8c2fd61c1fe2)
![Example6](https://github.com/Phhofm/models/assets/14755670/cbab3532-02cb-4277-9774-0a12450711a1)
![Example7](https://github.com/Phhofm/models/assets/14755670/85659e12-a124-41ee-9688-57ecb04391cb)
![Example8](https://github.com/Phhofm/models/assets/14755670/6797da63-d670-45c2-88a2-9d35e7cd81b2)
![Example9](https://github.com/Phhofm/models/assets/14755670/04680869-99af-4003-8312-1a62b2ef1700)
![Example10](https://github.com/Phhofm/models/assets/14755670/823a03b0-0208-4505-b997-d10a8b813b3d)
![Example11](https://github.com/Phhofm/models/assets/14755670/512cfa9e-de86-45d1-ace8-7a24071812a0)
![Example12](https://github.com/Phhofm/models/assets/14755670/89a0ca79-4c80-48df-bed3-fbd2e93e2f28)
![Example13](https://github.com/Phhofm/models/assets/14755670/9db9da16-8463-480b-8823-7e6665ed293f)
![Example14](https://github.com/Phhofm/models/assets/14755670/49f7d485-9238-476a-be77-dae731daff2b)
![Example15](https://github.com/Phhofm/models/assets/14755670/cdfb393c-2f0f-4d18-8749-2c0d407cf17d)
![Example16](https://github.com/Phhofm/models/assets/14755670/74158efe-4d6c-45e2-b57e-70096a4a4f68)
![Example17](https://github.com/Phhofm/models/assets/14755670/61453d50-777f-47b3-9d3b-45274b01a9af)
![Example18](https://github.com/Phhofm/models/assets/14755670/651a7600-77bc-472d-b019-cfd928d15077)
![Example19](https://github.com/Phhofm/models/assets/14755670/fce75e50-ff81-46cc-82b6-c560a3500abe)
![Example20](https://github.com/Phhofm/models/assets/14755670/76ef4b26-75e7-4fbb-8a9c-2e8baab903b0)
![Example21](https://github.com/Phhofm/models/assets/14755670/ab7553e0-a6c1-4671-82ef-0e6148e047d2)
![Example22](https://github.com/Phhofm/models/assets/14755670/6c240fc1-5cbb-4f31-85d6-b61f3f5ca608)
![Example23](https://github.com/Phhofm/models/assets/14755670/95d06023-332a-4039-b4c1-ee06fe547127)
![Example24](https://github.com/Phhofm/models/assets/14755670/5ad867da-9765-4364-a0aa-07951d6fe64a)
![Example25](https://github.com/Phhofm/models/assets/14755670/112dcef6-eaef-4450-82ee-388d26e184af)