Please download the following datasets and pretrained models, and put them into the specified directories.
- PASCAL-5i
- COCO-20i
Final directory structure (only used directories and files are shown):

```
./data
├── COCO
│   ├── annotations
│   ├── train2014
│   ├── train2014_labels
│   ├── val2014
│   ├── val2014_labels
│   └── weights
├── VOCdevkit
│   └── VOC2012
│       ├── SegmentationClassAug
│       ├── JPEGImages
│       └── weights
└── README.md
```
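As an optional aid (this helper is not part of the repository; the path list simply mirrors the tree above), a minimal sketch that reports any directory still missing after the steps below:

```python
# Hypothetical helper, not part of tools.py: sanity-check the layout above.
from pathlib import Path

EXPECTED = [
    "COCO/annotations",
    "COCO/train2014",
    "COCO/train2014_labels",
    "COCO/val2014",
    "COCO/val2014_labels",
    "COCO/weights",
    "VOCdevkit/VOC2012/SegmentationClassAug",
    "VOCdevkit/VOC2012/JPEGImages",
    "VOCdevkit/VOC2012/weights",
]

def check_layout(root="./data"):
    root = Path(root)
    missing = [p for p in EXPECTED if not (root / p).is_dir()]
    for p in missing:
        print(f"[missing] {root / p}")
    print("Layout OK" if not missing else f"{len(missing)} directories missing")

if __name__ == "__main__":
    check_layout()
```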
PASCAL-5i:

- Download the Training/Validation data (2GB, tarball) and extract `VOCtrainval_11-May-2012.tar` to `./data/`.
- Download `SegmentationClassAug` (34M, tarball, GoogleDrive or BaiduDrive, code: FPTr) and extract it to `./data/VOCdevkit/VOC2012/`. This is an extended annotation set from SBD.
- Precomputed cross-entropy weights (only used for training; see the sketch after this list for one way such weights can be computed):
  - Option 1: Download from BaiduDrive (code: FPTr), extract `pascal_weights.tar` to `./data/VOCdevkit/VOC2012/`, and rename the extracted directory to `weights`.
  - Option 2: Generate them from the datasets:

    ```bash
    # Dry run to ensure the output paths are correct.
    cuda 0 python tools.py precompute_loss_weights with dataset=PASCAL dry_run=True
    # Then generate and save to disk.
    cuda 0 python tools.py precompute_loss_weights with dataset=PASCAL
    ```
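For context only: the following is not the repository's `precompute_loss_weights` implementation, which may store per-image weight maps or use a different scheme. It is a minimal sketch, assuming the common median-frequency-balancing recipe and the `SegmentationClassAug` label PNGs above, of how per-class cross-entropy weights can be precomputed.

```python
# Hedged sketch, NOT the repository's precompute_loss_weights: derives
# per-class cross-entropy weights by median-frequency balancing over the
# PASCAL training label masks.
from pathlib import Path

import numpy as np
from PIL import Image


def class_weights(label_dir, num_classes=21, ignore_index=255):
    counts = np.zeros(num_classes, dtype=np.int64)
    for png in Path(label_dir).glob("*.png"):
        labels = np.asarray(Image.open(png))          # palette indices = class ids
        valid = labels[labels != ignore_index]        # drop the ignore label
        counts += np.bincount(valid, minlength=num_classes)[:num_classes]
    freq = counts / counts.sum()
    # weight_c = median(freq) / freq_c, so rarer classes get larger weights.
    # (Assumes every class occurs at least once, which holds for VOC 2012.)
    return np.median(freq[freq > 0]) / np.maximum(freq, 1e-12)


if __name__ == "__main__":
    out_dir = Path("./data/VOCdevkit/VOC2012/weights")
    out_dir.mkdir(parents=True, exist_ok=True)
    w = class_weights("./data/VOCdevkit/VOC2012/SegmentationClassAug")
    np.save(out_dir / "class_weights.npy", w)  # hypothetical output file name
```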
COCO-20i:

- Create the directory `./data/COCO`.
- Download the 2014 Training images (13GB, zip), 2014 Val images (6GB, zip), and 2014 Train/Val annotations (241M, zip), and extract them to `./data/COCO/`.
- Generate offline labels (see the sketch after this list for how such labels can be rendered from the COCO annotations):

  ```bash
  python tools.py gen_coco_labels with sets=train2014
  python tools.py gen_coco_labels with sets=val2014
  ```

- Precomputed cross-entropy weights (only used for training):
  - Option 1: Download from BaiduDrive (code: FPTr), extract `coco_weights.tar` to `./data/COCO/`, and rename the extracted directory to `weights`.
  - Option 2: Generate them from the datasets:

    ```bash
    cuda 0 python tools.py precompute_loss_weights with dataset=COCO save_byte=True
    ```
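For reference, this is not the repository's `gen_coco_labels` implementation; it is a minimal sketch, assuming the standard `pycocotools` API and the `instances_*.json` annotation files, of how per-pixel label maps can be rendered offline into the `train2014_labels` and `val2014_labels` directories shown above.

```python
# Hedged sketch, NOT the repository's gen_coco_labels: renders per-pixel
# category-id maps from COCO instance annotations using pycocotools.
from pathlib import Path

import numpy as np
from PIL import Image
from pycocotools.coco import COCO


def render_labels(split="train2014", root="./data/COCO"):
    root = Path(root)
    coco = COCO(str(root / "annotations" / f"instances_{split}.json"))
    out_dir = root / f"{split}_labels"
    out_dir.mkdir(parents=True, exist_ok=True)

    for img_id in coco.getImgIds():
        info = coco.loadImgs(img_id)[0]
        label = np.zeros((info["height"], info["width"]), dtype=np.uint8)
        # Paint each non-crowd instance with its raw category id; overlapping
        # instances simply overwrite earlier ones. The actual script may
        # instead remap category ids to contiguous class indices.
        for ann in coco.loadAnns(coco.getAnnIds(imgIds=img_id, iscrowd=False)):
            label[coco.annToMask(ann) == 1] = ann["category_id"]
        Image.fromarray(label).save(out_dir / (Path(info["file_name"]).stem + ".png"))


if __name__ == "__main__":
    for split in ("train2014", "val2014"):
        render_labels(split)
```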