
update readme

Xintao, 3 years ago
commit 3e1f780f51
2 changed files with 48 additions and 46 deletions
  1. README.md (+2, -0)
  2. Training.md (+46, -46)

README.md (+2, -0)

@@ -23,6 +23,8 @@ We extend the powerful ESRGAN to a practical restoration application (namely, Re
 - :white_check_mark: [The inference code](inference_realesrgan.py) supports: 1) **tile** options; 2) images with **alpha channel**; 3) **gray** images; 4) **16-bit** images.
 - :white_check_mark: The training codes have been released. A detailed guide can be found in [Training.md](Training.md).
 
+---
+
 If Real-ESRGAN is helpful in your photos/projects, please help to :star: this repo. Thanks:blush: <br>
 Other recommended projects: &emsp; :arrow_forward: [GFPGAN](https://github.com/TencentARC/GFPGAN) &emsp; :arrow_forward: [BasicSR](https://github.com/xinntao/BasicSR) &emsp; :arrow_forward: [facexlib](https://github.com/xinntao/facexlib)
 

Training.md (+46, -46)

@@ -139,62 +139,62 @@ You can merge several folders into one meta_info txt. Here is the example:
 
 You can finetune Real-ESRGAN on your own dataset. Typically, the fine-tuning process can be divided into two cases:
 
-1. [generate degraded images on the fly](#Generate-degraded-images-on-the-fly)
-1. [use your own **paired** data(#Use-paired-training-data)
+1. [Generate degraded images on the fly](#Generate-degraded-images-on-the-fly)
+1. [Use your own **paired** data](#Use-paired-training-data)
 
 ### Generate degraded images on the fly
 
-Only high-resolution images are required. The low-quality images are generated with the degradation process in Real-ESRGAN during trainig.
+Only high-resolution images are required. The low-quality images are generated with the degradation process described in Real-ESRGAN during training.
 
-**Prepare dataset**
+**1. Prepare dataset**
 
 See [this section](#dataset-preparation) for more details.
 
-**Download pre-trained models**
+**2. Download pre-trained models**
 
 Download pre-trained models into `experiments/pretrained_models`.
 
-*RealESRGAN_x4plus.pth*
-
+- *RealESRGAN_x4plus.pth*:
    ```bash
    wget https://github.com/xinntao/Real-ESRGAN/releases/download/v0.1.0/RealESRGAN_x4plus.pth -P experiments/pretrained_models
    ```
 
-*RealESRGAN_x4plus_netD.pth*
-
+- *RealESRGAN_x4plus_netD.pth*:
    ```bash
    wget https://github.com/xinntao/Real-ESRGAN/releases/download/v0.2.2.3/RealESRGAN_x4plus_netD.pth -P experiments/pretrained_models
    ```
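
A quick way to confirm that both downloads landed where the trainer expects them; nothing here goes beyond the paths already used in the `wget` commands above:

```bash
# Both files should be listed with non-trivial sizes; a missing file means
# the corresponding wget above did not complete.
ls -lh experiments/pretrained_models/RealESRGAN_x4plus.pth \
       experiments/pretrained_models/RealESRGAN_x4plus_netD.pth
```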
 
 
-**Finetune**
+**3. Finetune**
 
 Modify [options/finetune_realesrgan_x4plus.yml](options/finetune_realesrgan_x4plus.yml) accordingly, especially the `datasets` part:
-    ```yml
-    train:
-        name: DF2K+OST
-        type: RealESRGANDataset
-        dataroot_gt: datasets/DF2K  # modify to the root path of your folder
-        meta_info: realesrgan/meta_info/meta_info_DF2Kmultiscale+OST_sub.txt  # modify to your own generate meta info txt
-        io_backend:
-            type: disk
-    ```
+
+```yml
+train:
+    name: DF2K+OST
+    type: RealESRGANDataset
+    dataroot_gt: datasets/DF2K  # modify to the root path of your folder
+    meta_info: realesrgan/meta_info/meta_info_DF2Kmultiscale+OST_sub.txt  # modify to your own generated meta_info txt
+    io_backend:
+        type: disk
+```
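
For reference, the `meta_info` txt referenced above appears to be a plain-text list of image paths relative to `dataroot_gt`, one path per line (the repo ships ready-made files under `realesrgan/meta_info/`). A minimal sketch for building one from your own GT folder, with the subfolder and output names as placeholders:

```bash
# Sketch only: list every GT image path relative to the dataset root, one per line.
# "DF2K_HR_sub" and "my_meta_info.txt" are placeholder names -- compare the result
# against the txt files in realesrgan/meta_info/ before training.
cd datasets/DF2K
mkdir -p meta_info
find DF2K_HR_sub -type f -name '*.png' | sort > meta_info/my_meta_info.txt
```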
 
 
 We use four GPUs for training. We use the `--auto_resume` argument to automatically resume the training if necessary.
-    ```bash
-    CUDA_VISIBLE_DEVICES=0,1,2,3 \
-    python -m torch.distributed.launch --nproc_per_node=4 --master_port=4321 realesrgan/train.py -opt options/finetune_realesrgan_x4plus.yml --launcher pytorch --auto_resume
-    ```
 
-### Use paired training data
+```bash
+CUDA_VISIBLE_DEVICES=0,1,2,3 \
+python -m torch.distributed.launch --nproc_per_node=4 --master_port=4321 realesrgan/train.py -opt options/finetune_realesrgan_x4plus.yml --launcher pytorch --auto_resume
+```
+
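
If you are finetuning on fewer GPUs, the same launch line should only need the device list and `--nproc_per_node` adjusted to match; a sketch of the obvious variant, not verified here:

```bash
# Hypothetical two-GPU variant: keep --nproc_per_node equal to the number of
# devices listed in CUDA_VISIBLE_DEVICES.
CUDA_VISIBLE_DEVICES=0,1 \
python -m torch.distributed.launch --nproc_per_node=2 --master_port=4321 realesrgan/train.py -opt options/finetune_realesrgan_x4plus.yml --launcher pytorch --auto_resume
```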
+### Use your own paired data
 
 You can also finetune RealESRGAN with your own paired data. It is more similar to fine-tuning ESRGAN.
 
-**Prepare dataset**
+**1. Prepare dataset**
 
 Assume that you already have two folders:
 
-- gt folder (Ground-truth, high-resolution images): datasets/DF2K/DIV2K_train_HR_sub
-- lq folder (Low quality, low-resolution images): datasets/DF2K/DIV2K_train_LR_bicubic_X4_sub
+- **gt folder** (Ground-truth, high-resolution images): *datasets/DF2K/DIV2K_train_HR_sub*
+- **lq folder** (Low quality, low-resolution images): *datasets/DF2K/DIV2K_train_LR_bicubic_X4_sub*
 
 Then, you can prepare the meta_info txt file using the script [scripts/generate_meta_info_pairdata.py](scripts/generate_meta_info_pairdata.py):
 
@@ -202,18 +202,16 @@ Then, you can prepare the meta_info txt file using the script [scripts/generate_
 python scripts/generate_meta_info_pairdata.py --input datasets/DF2K/DIV2K_train_HR_sub datasets/DF2K/DIV2K_train_LR_bicubic_X4_sub --meta_info datasets/DF2K/meta_info/meta_info_DIV2K_sub_pair.txt
 ```
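
Before (or after) generating the pair meta_info, a quick sanity check that the two folders really pair up one-to-one by filename can save a failed run; a small sketch using the paths from above:

```bash
# The gt and lq folders should contain exactly the same filenames;
# diff prints nothing when the two listings match.
diff <(ls datasets/DF2K/DIV2K_train_HR_sub) <(ls datasets/DF2K/DIV2K_train_LR_bicubic_X4_sub)
```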
 
 
-**Download pre-trained models**
+**2. Download pre-trained models**
 
 Download pre-trained models into `experiments/pretrained_models`.
 
-*RealESRGAN_x4plus.pth*
-
+- *RealESRGAN_x4plus.pth*
    ```bash
    wget https://github.com/xinntao/Real-ESRGAN/releases/download/v0.1.0/RealESRGAN_x4plus.pth -P experiments/pretrained_models
    ```
 
-*RealESRGAN_x4plus_netD.pth*
-
+- *RealESRGAN_x4plus_netD.pth*
    ```bash
    wget https://github.com/xinntao/Real-ESRGAN/releases/download/v0.2.2.3/RealESRGAN_x4plus_netD.pth -P experiments/pretrained_models
    ```
@@ -221,19 +219,21 @@ Download pre-trained models into `experiments/pretrained_models`.
 **Finetune**
 
 Modify [options/finetune_realesrgan_x4plus_pairdata.yml](options/finetune_realesrgan_x4plus_pairdata.yml) accordingly, especially the `datasets` part:
-    ```yml
-    train:
-        name: DIV2K
-        type: RealESRGANPairedDataset
-        dataroot_gt: datasets/DF2K  # modify to the root path of your folder
-        dataroot_lq: datasets/DF2K  # modify to the root path of your folder
-        meta_info: datasets/DF2K/meta_info/meta_info_DIV2K_sub_pair.txt  # modify to the root path of your folder
-        io_backend:
-        type: disk
-    ```
+
+```yml
+train:
+    name: DIV2K
+    type: RealESRGANPairedDataset
+    dataroot_gt: datasets/DF2K  # modify to the root path of your folder
+    dataroot_lq: datasets/DF2K  # modify to the root path of your folder
+    meta_info: datasets/DF2K/meta_info/meta_info_DIV2K_sub_pair.txt  # modify to your own meta_info txt
+    io_backend:
+        type: disk
+```
 
 We use four GPUs for training. We use the `--auto_resume` argument to automatically resume the training if necessary.
-    ```bash
-    CUDA_VISIBLE_DEVICES=0,1,2,3 \
-    python -m torch.distributed.launch --nproc_per_node=4 --master_port=4321 realesrgan/train.py -opt options/finetune_realesrgan_x4plus_pairdata.yml --launcher pytorch --auto_resume
-    ```
+
+```bash
+CUDA_VISIBLE_DEVICES=0,1,2,3 \
+python -m torch.distributed.launch --nproc_per_node=4 --master_port=4321 realesrgan/train.py -opt options/finetune_realesrgan_x4plus_pairdata.yml --launcher pytorch --auto_resume
+```
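
Once finetuning finishes, the new generator checkpoints should land under `experiments/<experiment_name>/models/` (the usual BasicSR layout, if I remember the convention correctly) and can be passed back to the inference script. The checkpoint path below is a placeholder and the flag names are from memory, so treat them as assumptions and verify with `python inference_realesrgan.py -h`:

```bash
# Sketch only: run inference with a finetuned checkpoint instead of the released weights.
# The experiment folder and checkpoint name are placeholders for your own run.
python inference_realesrgan.py -n RealESRGAN_x4plus \
    --model_path experiments/your_finetune_experiment/models/net_g_latest.pth \
    -i inputs -o results
```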