tensorflow load_weights() gives different predictions when loaded from a different path

I am training a DCN model for ranking. After training, I save the weights with model.save_weights(filepath) and load them in a different file with model.load_weights(filepath). Training and saving are done on Colab, while the saved weights are loaded on my local system. However, for the same input, the prediction made on Colab and the prediction made on my local system differ, even after compiling the model. The versions of tensorflow, tensorflow_recommenders and tensorflow_ranking are the same on both. To avoid this issue, I tried saving the entire model with model.save('model.h5'), but this raises the error below:

NotImplementedError: Saving the model to HDF5 format requires the model to be a Functional model or a Sequential model. It does not work for subclassed models, because such models are defined via the body of a Python method, which isn't safely serializable. Consider saving to the Tensorflow SavedModel format (by setting save_format="tf") or using `save_weights`.

I also tried a plain model.save(). But this logs the following warning:

WARNING:absl:Found untraced functions such as ranking_3_layer_call_fn, ranking_3_layer_call_and_return_conditional_losses, dense_layer_call_fn, dense_layer_call_and_return_conditional_losses, ranking_3_layer_call_fn while saving (showing 5 of 10). These functions will not be directly callable after loading.
INFO:tensorflow:Assets written to: model/assets

Saving with model.to_json() does not work either. Is there any way to solve this?
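For context, the weights-only round trip described above looks roughly like this; TinyRanker is a hypothetical minimal stand-in for the actual DCN ranking model:

```python
import numpy as np
import tensorflow as tf

# Hypothetical minimal subclassed model standing in for the DCN ranking model.
class TinyRanker(tf.keras.Model):
    def __init__(self):
        super().__init__()
        self.dense = tf.keras.layers.Dense(1)

    def call(self, inputs):
        return self.dense(inputs)

x = np.random.rand(4, 3).astype("float32")

model = TinyRanker()
model(x)  # call once so the variables are created before saving
model.save_weights("tiny_ranker.weights.h5")

# In the other file/process: rebuild the SAME architecture, build it with a
# forward pass, then load the weights; predictions should then match exactly.
restored = TinyRanker()
restored(x)
restored.load_weights("tiny_ranker.weights.h5")

assert np.allclose(model(x).numpy(), restored(x).numpy())
```

If the restored predictions still differ for the same input, the mismatch usually comes from the environment (library versions, nondeterministic preprocessing) rather than from load_weights itself.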



Solution 1:[1]

A likely reason for the different predictions is a difference in configuration between the two platforms. Make sure the same versions of Python and all dependent libraries are installed on both.

You can also call tf.keras.backend.clear_session() before loading the saved model; it clears clutter from old models and layers and resets Keras' global state.
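As a small illustration, clear_session() resets the global layer-name counters along with the rest of Keras' state (the names shown are just what Keras auto-generates):

```python
import tensorflow as tf

# Each new layer gets an auto-generated name from a global counter,
# e.g. "dense", "dense_1", "dense_2", ...
tf.keras.layers.Dense(1)
tf.keras.layers.Dense(1)

tf.keras.backend.clear_session()  # wipe Keras' global state and counters

layer = tf.keras.layers.Dense(1)
assert layer.name == "dense"  # the counter starts over after clearing
```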

There are two ways to save an entire model: the SavedModel format and the HDF5 format.

The error message itself recommends the SavedModel format for your model.

  1. You can save and load a model in the SavedModel format using the code below:

    • To Save: tf.saved_model.save(model, path_to_dir)

    • To Load: model = tf.saved_model.load(path_to_dir)

  2. If a model with custom objects is saved in HDF5 format, you must pass those objects to the custom_objects argument when loading it:

    tf.keras.models.load_model(path, custom_objects={'CustomLayer': CustomLayer})
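A minimal sketch of the custom_objects flow, with a hypothetical Doubler layer (the key in custom_objects must match the class name recorded in the file):

```python
import numpy as np
import tensorflow as tf

# Hypothetical custom layer; the HDF5 file stores only its config, so the
# class itself must be supplied again via custom_objects at load time.
class Doubler(tf.keras.layers.Layer):
    def call(self, inputs):
        return inputs * 2.0

model = tf.keras.Sequential([
    tf.keras.Input(shape=(3,)),
    Doubler(),
    tf.keras.layers.Dense(1),
])
model.save("model_with_custom.h5")  # HDF5 works here: the model is Sequential

restored = tf.keras.models.load_model(
    "model_with_custom.h5", custom_objects={"Doubler": Doubler}
)

x = np.random.rand(2, 3).astype("float32")
assert np.allclose(model(x).numpy(), restored(x).numpy())
```

Note that this only applies to Functional or Sequential models; a subclassed model still cannot be saved to HDF5 at all, which is exactly what the NotImplementedError above says.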

To get a better understanding of these methods, refer to the TensorFlow guides on saving and loading models.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: TFer2