When running `test.py` I get the following error:
```python
model = TransSARV2()
model.load_state_dict(torch.load("./pretrained_models/model.pth"))
model.eval()
```
```
/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py in load_state_dict(self, state_dict, strict)
   1603         if len(error_msgs) > 0:
   1604             raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
-> 1605                 self.__class__.__name__, "\n\t".join(error_msgs)))
   1606         return _IncompatibleKeys(missing_keys, unexpected_keys)
   1607

RuntimeError: Error(s) in loading state_dict for TransSARV2:
	Missing key(s) in state_dict: "Tenc.patch_embed1.proj.weight", "Tenc.patch_embed1.proj.bias", "Tenc.patch_embed1.norm.weight", "Tenc.patch_embed1.norm.bias", "Tenc.patch_embed2.proj.weight", "Tenc.patch_embed2.proj.bias", "Tenc.patch_embed2.norm.weight", "Tenc.patch_embed2.norm.bias", "Tenc.patch_embed3.proj.weight", "Tenc.patch_embed3.proj.bias", "Tenc.patch_embed3.norm.weight", "Tenc.patch_embed3.norm.bias", "Tenc.patch_embed4.proj.weight", "Tenc.patch_embed4.proj.bias", "Tenc.patch_embed4.norm.weight", "Tenc.patch_embed4.norm.bias", "Tenc.patch_embed5.proj.weight", "Tenc.patch_embed5.proj.bias", "Tenc.patch_embed5.norm.weight", "Tenc.patch_embed5.norm.bias", "Tenc.block1.0.norm1.weight", "Tenc.block1.0.norm1.bias", "Tenc.block1.0.attn.q.weight", "Tenc.block1.0.attn.kv.weight", "Tenc.block1.0.attn.proj.weight", "Tenc.block1.0.attn.proj.bias", "Tenc.block1.0.attn.sr.weight", "Tenc.block1.0.attn.sr.bias", "Tenc.block1.0.attn.norm.weight", "Tenc.block1.0.attn.norm.bias", "Tenc.block1.0.norm2.weight", "Tenc.block1.0.norm2.bias", "Tenc.block1.0.mlp.fc1.weight", "Tenc.block1.0.mlp.fc1.bias", "Tenc.block1.0.mlp.dwconv.dwconv.weight", "Tenc.block1.0.mlp.dwconv.dwconv.bias", "Tenc.block1.0.mlp.fc2.weight", "Tenc.block1.0.mlp.fc2.bias", "Tenc.block1.1.norm1.weight", "Tenc.block1.1.norm1.bias", "Tenc.block1.1.attn.q.weight", "Tenc.block1.1.attn.kv.weight", "Tenc.block1.1.attn.proj.weight", "Tenc.block1.1.attn.proj.bias", "Tenc.block1.1.attn.sr.weight", "Tenc.block1.1.attn.sr.bias"...
	Unexpected key(s) in state_dict: "module.Tenc.patch_embed1.proj.weight", "module.Tenc.patch_embed1.proj.bias", "module.Tenc.patch_embed1.norm.weight", "module.Tenc.patch_embed1.norm.bias", "module.Tenc.patch_embed2.proj.weight", "module.Tenc.patch_embed2.proj.bias", "module.Tenc.patch_embed2.norm.weight", "module.Tenc.patch_embed2.norm.bias", "module.Tenc.patch_embed3.proj.weight", "module.Tenc.patch_embed3.proj.bias", "module.Tenc.patch_embed3.norm.weight", "module.Tenc.patch_embed3.norm.bias", "module.Tenc.patch_embed4.proj.weight", "module.Tenc.patch_embed4.proj.bias", "module.Tenc.patch_embed4.norm.weight", "module.Tenc.patch_embed4.norm.bias", "module.Tenc.patch_embed5.proj.weight", "module.Tenc.patch_embed5.proj.bias", "module.Tenc.patch_embed5.norm.weight", "module.Tenc.patch_embed5.norm.bias", "module.Tenc.block1.0.norm1.weight", "module.Tenc.block1.0.norm1.bias", "module.Tenc.block1.0.attn.q.weight", "module.Tenc.block1.0.attn.kv.weight", "module.Tenc.block1.0.attn.proj.weight", "module.Tenc.block1.0.attn.proj.bias", "module.Tenc.block1.0.attn.sr.weight", "module.Tenc.block1.0.attn.sr.bias", "module.Tenc.block1.0.attn.norm.weight", "module.Tenc.block1.0.attn.norm.bias", "module.Tenc.block1.0.norm2.weight", "module.Tenc.block1.0.norm2.bias", "module.Tenc.block1.0.mlp.fc1.weight", "module.Tenc.block1.0.mlp.fc1.bias", "module.Tenc.block1.0.mlp.dwconv.dwconv.weight", "module.Tenc.block1.0.mlp.dwconv.dwconv.bias", "module.Tenc.block1.0.mlp.fc2.weight", "module.Tenc.bl...
```
The weights file is probably not corrupted: every "unexpected" key is identical to a "missing" key except for a leading `module.` prefix. That prefix is what `nn.DataParallel` (and `DistributedDataParallel`) adds when a wrapped model's `state_dict()` is saved, so the checkpoint was most likely saved from a parallel-wrapped model while `test.py` loads it into a plain `TransSARV2()`.
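A common workaround is to strip the `module.` prefix from the checkpoint keys before calling `load_state_dict`. Here is a minimal sketch; the helper name `strip_module_prefix` is mine, and the commented usage assumes the `TransSARV2` class and checkpoint path from the snippet above:

```python
from collections import OrderedDict

def strip_module_prefix(state_dict):
    """Remove the 'module.' prefix that nn.DataParallel adds to every key."""
    return OrderedDict(
        (k[len("module."):] if k.startswith("module.") else k, v)
        for k, v in state_dict.items()
    )

# Usage sketch with the model and path from the question:
# model = TransSARV2()
# ckpt = torch.load("./pretrained_models/model.pth", map_location="cpu")
# model.load_state_dict(strip_module_prefix(ckpt))
# model.eval()
```

Alternatively, wrapping the model the same way it was saved (`torch.nn.DataParallel(model)`) before loading should also make the keys line up. Note that `strict=False` would not help here, since mismatched keys are skipped rather than renamed.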