- Clone the repository: https://github.com/zhoushen1/DCMPNet
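If the repository is hosted as a standard Git repository, a clone along these lines should work (the .git suffix is an assumption):
git clone https://github.com/zhoushen1/DCMPNet.git
cd DCMPNet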
- Install PyTorch 1.12.0 and torchvision 0.13.0.
conda install pytorch==1.12.0 torchvision==0.13.0 -c pytorch
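If you prefer pip, a pinned install such as the following should also work; the exact CUDA build to pick depends on your driver, so treat this as a sketch:
pip install torch==1.12.0 torchvision==0.13.0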
- Install the other dependencies.
pip install -r requirements.txt
Download the RESIDE datasets from here. You also need to put the depth maps into the data folder; they can be downloaded from https://pan.baidu.com/s/1sNoMlcehMUtSLRuRvsjKKw?pwd=dbcw (extraction code: dbcw).
The final directory structure should match the following (please check it carefully):
┬─ save_models
│   ├─ indoor
│   │   ├─ DIACMPN-dehaze-Indoor.pth
│   │   ├─ DIACMPN-depth-Indoor.pth
│   │   └─ ... (model name)
│   └─ ... (exp name)
└─ data
    ├─ RESIDE-IN
    │   ├─ train
    │   │   ├─ GT
    │   │   │   └─ ... (image filename)
    │   │   └─ hazy
    │   │       └─ ... (image filename)
    │   └─ test
    │       ├─ GT
    │       │   └─ ... (image filename)
    │       └─ hazy
    │           └─ ... (image filename)
    └─ ... (dataset name)
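As a convenience, the expected folders can be created with shell commands like the following (the paths are taken directly from the tree above; the placeholder entries are filled with your own checkpoints and images):
mkdir -p save_models/indoor
mkdir -p data/RESIDE-IN/train/GT data/RESIDE-IN/train/hazy
mkdir -p data/RESIDE-IN/test/GT data/RESIDE-IN/test/hazy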
To customize the training settings for each experiment, navigate to the configs
folder. Modify the configurations as needed.
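For example, you can first list the available configuration files and then open the one matching your experiment in a text editor (the file names inside configs are not listed in this README, so check the folder):
ls configs/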
After adjusting the settings, use the following script to initiate the training of the model:
CUDA_VISIBLE_DEVICES=X python train.py --model (model name) --dataset (dataset name) --exp (exp name)
For example, to train DIACMPN-dehaze-Indoor on the ITS (RESIDE indoor training set):
CUDA_VISIBLE_DEVICES=0 python train.py --model DIACMPN-dehaze-Indoor --dataset RESIDE-IN --exp indoor
Run the following script to evaluate the trained model with a single GPU.
CUDA_VISIBLE_DEVICES=X python test.py --model (model name) --dataset (dataset name) --exp (exp name)
For example, to test DIACMPN-dehaze-Indoor on the SOTS indoor set:
CUDA_VISIBLE_DEVICES=0 python test.py --model DIACMPN-dehaze-Indoor --dataset RESIDE-IN --exp indoor
Don't hesitate to contact me if you run into any problems when using this code.
Zhou Shen
Faculty of Information Engineering and Automation
Kunming University of Science and Technology
Email: zhoushennn@163.com