TFLite Failed to Allocate Tensor

I have an exported frozen graph .pb file, which I converted to TFLite with

import tensorflow as tf

graph_def_file = "model.pb"
input_arrays = ["Placeholder"]
input_shapes = {"Placeholder": [1024, 2048, 3]}
output_arrays = ["final_output"]
converter = tf.compat.v1.lite.TFLiteConverter.from_frozen_graph(
    graph_def_file, input_arrays, output_arrays, input_shapes)

tflite_model = converter.convert()
with open("my_model.tflite", "wb") as f:
    f.write(tflite_model)
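A quick way to sanity-check the conversion (a minimal sketch, assuming the same tflite_model variable as above) is to read the recorded input and output shapes back from the in-memory buffer before writing it to disk:

check = tf.lite.Interpreter(model_content=tflite_model)
# Shapes the converter baked into the model; compare with the shapes fed in above.
print(check.get_input_details()[0]["shape"])
print(check.get_output_details()[0]["shape"])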

However, I could not load the model with

interpreter = tf.lite.Interpreter(model_path="my_model.tflite")
interpreter.resize_tensor_input(0, [1024, 2048, 3], strict=True)
interpreter.allocate_tensors()
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
<ipython-input-28-3cfaee7dc51c> in <module>
      1 interpreter = tf.lite.Interpreter(model_path=model_path)
      2 interpreter.resize_tensor_input(0, [1024, 2048, 3], strict=True)
----> 3 interpreter.allocate_tensors()

~/.local/lib/python3.9/site-packages/tensorflow/lite/python/interpreter.py in allocate_tensors(self)
    512   def allocate_tensors(self):
    513     self._ensure_safe()
--> 514     return self._interpreter.AllocateTensors()
    515 
    516   def _safe_to_run(self):

RuntimeError: tensorflow/lite/kernels/conv.cc:349 input->dims->data[3] != filter->dims->data[3] (65 != 64)Node number 11 (CONV_2D) failed to prepare.Failed to apply the default TensorFlow Lite delegate indexed at 0.
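One possible way to narrow this down is to list every tensor recorded in the .tflite file and look for one whose last dimension is 65, which is what the CONV_2D at node 11 complains about. A minimal sketch, assuming get_tensor_details() can be called without allocating tensors:

import tensorflow as tf

interpreter = tf.lite.Interpreter(model_path="my_model.tflite")
for detail in interpreter.get_tensor_details():
    shape = detail["shape"]
    # Print any tensor whose channel dimension matches the mismatch in the error.
    if len(shape) and shape[-1] in (64, 65):
        print(detail["index"], detail["name"], shape)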

However, I can successfully reload the frozen graph from the .pb file with

with tf.gfile.GFile('my_model.pb', "rb") as pb:  # tf.compat.v1.gfile.GFile under TF2
    graph_def = tf.GraphDef()                    # tf.compat.v1.GraphDef under TF2
    graph_def.ParseFromString(pb.read())
with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")

and can also successfully generate predictions with

node_in = graph.get_tensor_by_name('Placeholder:0')
node_out = graph.get_tensor_by_name('final_output:0')

with tf.Session(graph=graph) as sess:  # tf.compat.v1.Session under TF2
    # No variable initializer needed: all weights are frozen into constants.
    feed_dict = {node_in: input_img}
    pred = sess.run(node_out, feed_dict)
    print(pred)
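Since the frozen graph runs fine, it may also be worth comparing the static shape the graph records for the placeholder against the [1024, 2048, 3] shape that was passed to the converter and to resize_tensor_input. A small sketch reusing node_in and node_out from the snippet above:

# Static shapes recorded in the frozen graph (may contain unknown dimensions).
print(node_in.shape)
print(node_out.shape)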

I've checked node 11 of the .tflite file in Netron, but everything seemed fine [Netron screenshot of node 11]. What could be the problem?



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow