Converted TF Lite pre-trained model not working correctly

I recently used this TensorFlow model. I converted the compressed model to a .tflite file with the code below:

import pathlib

import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("movenet_multipose_lightning")
converter.target_spec.supported_ops = [
  tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite ops.
  tf.lite.OpsSet.SELECT_TF_OPS,    # enable select TensorFlow ops.
]
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Dynamic-range quantized model.
tflite_quant_model = converter.convert()
tflite_model_file = pathlib.Path("movenet_multipose.tflite")
tflite_model_file.write_bytes(tflite_quant_model)

# Float16 variant (same converter, now also allowing float16 weights).
converter.target_spec.supported_types = [tf.float16]
tflite_fp16_model = converter.convert()
tflite_model_fp16_file = pathlib.Path("movenet_multipose_f16.tflite")
tflite_model_fp16_file.write_bytes(tflite_fp16_model)

Everything seems fine and the .tflite files are exported correctly, but when I use the converted model in an Android app, it throws this error:

Cannot copy to a TensorFlowLite tensor (serving_default_input:0) with 589824 bytes from a Java Buffer with 147456 bytes.
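For reference, the two sizes in the message differ by exactly a factor of four, which is consistent with the model's input tensor being float32 (4 bytes per element) while the app's buffer holds one byte per element (or a tensor with four times as many elements):

```python
# Byte counts taken from the error message above.
expected_bytes = 589824   # TFLite input tensor (serving_default_input:0)
supplied_bytes = 147456   # Java Buffer filled by the Android app

# float32 elements take 4 bytes; uint8 elements take 1 byte.
print(expected_bytes // supplied_bytes)  # -> 4
```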

So what is the problem? How can I correctly convert models like this for use in Android apps?
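One way to diagnose a mismatch like this is to read the input tensor's shape and dtype back from the .tflite file itself, since the Android-side buffer must match them exactly. A minimal sketch (it builds a tiny stand-in model only so the snippet runs on its own; in practice you would pass model_path="movenet_multipose.tflite" from the conversion above instead of model_content):

```python
import numpy as np
import tensorflow as tf

# Tiny stand-in model, only to make this snippet self-contained; replace
# model_content=... with model_path="movenet_multipose.tflite" in practice.
@tf.function(input_signature=[tf.TensorSpec([1, 192, 192, 3], tf.float32)])
def double(x):
    return x * 2.0

converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [double.get_concrete_function()])
tflite_bytes = converter.convert()

interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()

for detail in interpreter.get_input_details():
    # The Android ByteBuffer must hold exactly prod(shape) * itemsize bytes.
    nbytes = int(np.prod(detail["shape"])) * np.dtype(detail["dtype"]).itemsize
    print(detail["name"], detail["shape"], detail["dtype"], nbytes)
```

If the reported dtype is float32 while the app fills a byte-per-pixel buffer, or the reported shape differs from the image size the app feeds, the byte counts in the error message follow directly.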



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
