loading pretrained model #37
Hi,

ssd = build_bidet_ssd('test', 300, 21, nms_conf_thre=0.03)
model_path = build_dir + "/pretrain/BiDet-SSD300-VOC_66.0.pth"
ssd.load_weights(model_path)
Hi,

model = build_bidet_ssd('train', 300, 80,

Thanks
Hi, that should also work. I think you need to set strict to False. In any case, the best option is to use the load function.
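One caveat: in PyTorch, load_state_dict(strict=False) only tolerates missing or unexpected keys; it still raises on shape mismatches. A common workaround is to filter the checkpoint down to parameters whose shapes match the model before loading. A minimal sketch of that filter, using plain dicts of shape tuples in place of real tensors (filter_matching is a hypothetical helper, not part of this repo):

```python
def filter_matching(checkpoint_shapes, model_shapes):
    """Keep only entries whose key exists in the model and whose shape matches.

    Both arguments map parameter names to shape tuples, mimicking the
    structure of a PyTorch state_dict.
    """
    return {
        name: shape
        for name, shape in checkpoint_shapes.items()
        if model_shapes.get(name) == shape
    }

# Example: a VOC (21-class) checkpoint against an 81-class model.
ckpt = {"conf.0.weight": (84, 512, 3, 3), "loc.0.weight": (16, 512, 3, 3)}
model = {"conf.0.weight": (324, 512, 3, 3), "loc.0.weight": (16, 512, 3, 3)}
print(filter_matching(ckpt, model))  # only loc.0.weight survives
```

With real tensors you would compare v.shape instead of the values themselves, then pass the filtered dict to model.load_state_dict(..., strict=False).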
I tried your code. I find that when the parameter num_class is set to 21, there is no problem, but I get an error when it is set to 80. Thanks
Mind that the pre-trained SSD was trained on the PASCAL VOC dataset, which only has 21 classes. To use the COCO dataset, you need to train your own network.
@samagalhaes Thanks for your kind reply. And yes, @Jawae, the pre-trained model is for VOC, which has 21 classes; it can't be used for COCO, which has 81 classes.
One final question: may I ask what the loss value is when training converges?
Sorry for my late reply. If I remember correctly, on the VOC dataset, the
But the loss value alone doesn't determine the final mAP; I will refer you to this issue.
Hi,
I downloaded your trained model BiDet-SSD300-VOC_66.0.pth and want to load it with the network in bidet.ssd.py, but I ran into a size-mismatch error as follows:
RuntimeError: Error(s) in loading state_dict for BiDetSSD:
size mismatch for conf.0.weight: copying a param with shape torch.Size([84, 512, 3, 3]) from checkpoint, the shape in current model is torch.Size([324, 512, 3, 3]).
size mismatch for conf.0.bias: copying a param with shape torch.Size([84]) from checkpoint, the shape in current model is torch.Size([324]).
size mismatch for conf.1.weight: copying a param with shape torch.Size([126, 1024, 3, 3]) from checkpoint, the shape in current model is torch.Size([486, 1024, 3, 3]).
size mismatch for conf.1.bias: copying a param with shape torch.Size([126]) from checkpoint, the shape in current model is torch.Size([486]).
size mismatch for conf.2.weight: copying a param with shape torch.Size([126, 512, 3, 3]) from checkpoint, the shape in current model is torch.Size([486, 512, 3, 3]).
size mismatch for conf.2.bias: copying a param with shape torch.Size([126]) from checkpoint, the shape in current model is torch.Size([486]).
size mismatch for conf.3.weight: copying a param with shape torch.Size([126, 256, 3, 3]) from checkpoint, the shape in current model is torch.Size([486, 256, 3, 3]).
size mismatch for conf.3.bias: copying a param with shape torch.Size([126]) from checkpoint, the shape in current model is torch.Size([486]).
size mismatch for conf.4.weight: copying a param with shape torch.Size([84, 256, 3, 3]) from checkpoint, the shape in current model is torch.Size([324, 256, 3, 3]).
size mismatch for conf.4.bias: copying a param with shape torch.Size([84]) from checkpoint, the shape in current model is torch.Size([324]).
size mismatch for conf.5.weight: copying a param with shape torch.Size([84, 256, 3, 3]) from checkpoint, the shape in current model is torch.Size([324, 256, 3, 3]).
size mismatch for conf.5.bias: copying a param with shape torch.Size([84]) from checkpoint, the shape in current model is torch.Size([324]).
Can you advise how to fix it?
Thanks
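For what it's worth, the mismatched channel counts in the traceback follow directly from SSD's head layout: each conf layer has num_anchors * num_classes output channels. A quick check, assuming the standard SSD300 anchor counts of [4, 6, 6, 6, 4, 4] per feature map:

```python
# Assumed SSD300 anchor counts for the six prediction feature maps.
anchors_per_map = [4, 6, 6, 6, 4, 4]

def conf_channels(num_classes):
    """Output channels of each conf layer: anchors * classes."""
    return [a * num_classes for a in anchors_per_map]

print(conf_channels(21))  # VOC checkpoint: [84, 126, 126, 126, 84, 84]
print(conf_channels(81))  # 81-class model: [324, 486, 486, 486, 324, 324]
```

These values match the checkpoint shapes (84/126) and the current-model shapes (324/486) in the error above exactly, which is why a 21-class VOC checkpoint cannot be loaded into a model built for COCO's 81 classes.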