---
license: cc-by-4.0
pipeline_tag: image-to-image
tags:
- pytorch
- super-resolution
- pretrain
---

[Link to Github Release](https://github.com/Phhofm/models/releases/tag/4xmssim_realplksr_dysample_pretrain)  

# 4xmssim_realplksr_dysample_pretrain
Scale: 4  
Architecture: [RealPLKSR with Dysample](https://github.com/muslll/neosr/?tab=readme-ov-file#supported-archs)  
Architecture Option: [realplksr](https://github.com/muslll/neosr/blob/master/neosr/archs/realplksr_arch.py)  

Author: Philip Hofmann  
License: CC-BY-4.0  
Purpose: Pretrained  
Subject: Photography  
Input Type: Images  
Release Date: 27.06.2024  

Dataset: [nomosv2](https://github.com/muslll/neosr/?tab=readme-ov-file#-datasets)  
Dataset Size: 6000  
OTF (on the fly augmentations): No  
Pretrained Model: None (=From Scratch)  
Iterations: 200,000  
Batch Size: 8  
GT Size: 192, 512  
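
The training settings above correspond to a neosr training configuration. The fragment below is purely illustrative: the key names are assumptions following the common BasicSR-style convention and may not match the actual neosr schema or the config used for this model.

```yaml
# Illustrative sketch only - key names are hypothetical, not the real config
scale: 4
datasets:
  train:
    name: nomosv2
    gt_size: 192        # GT size listed above: 192, 512
    batch_size: 8
network_g:
  type: realplksr
  upsampler: dysample   # DySample upsampling (assumed option name)
train:
  total_iter: 200000
```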

Description: [DySample](https://arxiv.org/pdf/2308.15085) was recently added to RealPLKSR; from what I have seen, it can resolve or help avoid the checkerboard / grid pattern in inference outputs. So with the [commits from three days ago, 24.06.2024, on neosr](https://github.com/muslll/neosr/commits/master/?since=2024-06-24&until=2024-06-24), I wanted to create a 4x photo pretrain that I can then use when training further RealPLKSR models with DySample, specifically to stabilize the beginning of training.

Showcase:  
[Imgsli](https://imgsli.com/Mjc0OTA1)  
[Slowpics](https://slow.pics/c/I9grkcqM)  

![Example1](https://github.com/Phhofm/models/assets/14755670/25406570-3388-4d22-ae68-68560c8bd917)
![Example2](https://github.com/Phhofm/models/assets/14755670/bf3e946b-9646-441e-a15e-9bbc290a8885)
![Example3](https://github.com/Phhofm/models/assets/14755670/7e5ccec4-f485-4e02-a76b-9ec8827ee663)
![Example4](https://github.com/Phhofm/models/assets/14755670/9c665c12-c30f-4b7f-a0a6-46a3645633fe)
![Example5](https://github.com/Phhofm/models/assets/14755670/4868ba82-fe8a-468c-bfd4-1f471d3ba361)
![Example6](https://github.com/Phhofm/models/assets/14755670/3ee42167-721f-4866-8529-d0a19f121ff1)
![Example7](https://github.com/Phhofm/models/assets/14755670/a3a23246-8f58-4810-823b-10b701bdbced)
![Example8](https://github.com/Phhofm/models/assets/14755670/f6665676-0845-4019-a476-ed253306f838)
![Example9](https://github.com/Phhofm/models/assets/14755670/3da3a0d9-ef5e-4e66-a1d1-03dae2bf437b)
![Example10](https://github.com/Phhofm/models/assets/14755670/3468d468-09a0-42e4-945b-f63e48d9744b)
![Example11](https://github.com/Phhofm/models/assets/14755670/83cfa6d2-3fc9-4099-834a-129a6e87e4de)