Is a .ckpt file the same as a .pt file for a PyTorch model on Android?

I have a .ckpt checkpoint file for image recognition from my data scientist, and I would like to convert it to a .pt file following the instructions on the PyTorch website: https://pytorch.org/tutorials/beginner/deeplabv3_on_android.html

This is what I did:

```python
model = torch.load(os.path.join(model_path, 'Image_segmentation.ckpt'), map_location=device)
model.eval()

scriptedm = torch.jit.script(model)
torch.jit.save(scriptedm, "Image_segmentation_Android.pt")
```

However, I got the following error while trying to do so:

```
NotSupportedError                         Traceback (most recent call last)
<ipython-input-31-a8138feb2578> in <module>
      1 model = torch.load(os.path.join(model_path,'model_eyeglasses.ckpt'), map_location=device)
      2 model.eval()
----> 3 scriptedm = torch.jit.script(model)
      4 torch.jit.save(scriptedm, "model_eyeglasses_Android.pt")
      5 model.to(device)
```

After some reading, it seems that both file types can be used in Android development. I usually script in Python and am very new to Android, so I cannot be sure.

I was wondering if someone could confirm this? Unfortunately, I won't be able to get in contact with our data scientist for quite some time to train another model in .pt format.

Many thanks for your help.



Solution 1:[1]

There isn't an established difference between the file suffixes, because torch.save can pickle arbitrary Python objects under any suffix you want. For example, you can save the model object itself directly, or you can save a dictionary that includes multiple models. (Related answer: https://stackoverflow.com/a/70541507/13095028.)
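As a quick illustration (a minimal sketch using a throwaway nn.Linear rather than the asker's actual model), the suffix passed to torch.save is purely a filename choice:

```python
import torch
import torch.nn as nn

# Throwaway model purely for illustration; the original .ckpt could hold
# anything the data scientist chose to save.
model = nn.Linear(4, 2)

# torch.save pickles arbitrary Python objects; the suffix is just a name.
torch.save(model, "model.pt")                          # the whole nn.Module
torch.save(model.state_dict(), "weights.ckpt")         # only the weights
torch.save({"model": model, "epoch": 10}, "run.tar")   # a dict of objects

# What torch.load returns depends entirely on what was saved, not on the suffix.
obj = torch.load("weights.ckpt", map_location="cpu")
print(type(obj))  # collections.OrderedDict for a state_dict
```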

As for why JIT scripting failed, there can be a variety of reasons. It could be that the tensor operations involved in the model genuinely are not supported (ref: https://pytorch.org/docs/stable/jit_unsupported.html).
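If scripting itself is what fails, the exception message usually names the unsupported construct. Here is a minimal sketch, assuming `model` is the nn.Module loaded from the checkpoint and that a dummy input shape such as 1x3x224x224 matches what the network expects; it catches the scripting error and falls back to tracing, at the cost of losing data-dependent control flow:

```python
import torch

# `model` is assumed to be the nn.Module loaded from the checkpoint.
example_input = torch.rand(1, 3, 224, 224)  # assumed input shape; adjust to your model

try:
    scripted = torch.jit.script(model)
except Exception as err:
    # The error message points at the unsupported operation or Python construct.
    print(f"Scripting failed: {err}")
    # torch.jit.trace records the ops executed on the example input instead of
    # compiling the Python source, so it often succeeds where scripting does not.
    scripted = torch.jit.trace(model, example_input)

scripted.save("Image_segmentation_Android.pt")
```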

It could also be a loading error, depending on how the model was saved. You can either save the model object directly or save just the state_dict; they need to be loaded differently, as per the PyTorch docs: https://pytorch.org/tutorials/beginner/saving_loading_models.html#saving-loading-model-for-inference
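Putting the two cases together, one way forward is to check what torch.load actually returned. This is only a sketch: `MySegmentationNet` is a placeholder for whatever architecture the data scientist actually used, and the `"state_dict"` key is just a common checkpoint convention (e.g. in PyTorch Lightning), not something guaranteed to be in this file:

```python
import torch

device = torch.device("cpu")
obj = torch.load("Image_segmentation.ckpt", map_location=device)

if isinstance(obj, torch.nn.Module):
    # The entire model object was pickled, so it can be scripted directly.
    model = obj
else:
    # Only weights (or a checkpoint dict wrapping them) were saved, so the
    # model class has to be instantiated first.
    state_dict = obj.get("state_dict", obj) if isinstance(obj, dict) else obj
    model = MySegmentationNet()            # placeholder for the real architecture
    model.load_state_dict(state_dict)

model.eval()
scripted = torch.jit.script(model)
scripted.save("Image_segmentation_Android.pt")
```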

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 tnwei