How to fix the error "The size of byte buffer and the shape do not match" in Android Studio
I am learning Android development and trying to serve an image-captioning ML model using TensorFlow Lite. The code fails with the error "The size of byte buffer and the shape do not match" when executing the following line:
inputFeature0.loadBuffer(byteBuffer0);
There are similar questions on the forum, but I could not follow them. Can someone please guide me and suggest a solution?
Sample code shown on the TFLite model page:
try {
    Model3 model = Model3.newInstance(context);

    // Creates inputs for reference.
    TensorBuffer inputFeature0 = TensorBuffer.createFixedSize(new int[]{1, 37}, DataType.FLOAT32);
    inputFeature0.loadBuffer(byteBuffer);
    TensorBuffer inputFeature1 = TensorBuffer.createFixedSize(new int[]{1, 4096}, DataType.FLOAT32);
    inputFeature1.loadBuffer(byteBuffer);

    // Runs model inference and gets result.
    Model3.Outputs outputs = model.process(inputFeature0, inputFeature1);
    TensorBuffer outputFeature0 = outputs.getOutputFeature0AsTensorBuffer();

    // Releases model resources if no longer used.
    model.close();
} catch (IOException e) {
    // TODO Handle the exception
}
Code in MainActivity.java:
public void onClick(View Predict) {
    Bitmap img0 = Bitmap.createScaledBitmap(img, 1, 37, true);
    Bitmap img1 = Bitmap.createScaledBitmap(img, 1, 4096, true);
    try {
        Model3 model = Model3.newInstance(getApplicationContext());

        TensorImage tensorImage = new TensorImage(DataType.FLOAT32);
        tensorImage.load(img0);
        ByteBuffer byteBuffer0 = tensorImage.getBuffer();
        tensorImage.load(img1);
        ByteBuffer byteBuffer1 = tensorImage.getBuffer();

        // Creates inputs for reference.
        TensorBuffer inputFeature0 = TensorBuffer.createFixedSize(new int[]{1, 37}, DataType.FLOAT32);
        inputFeature0.loadBuffer(byteBuffer0);
        TensorBuffer inputFeature1 = TensorBuffer.createFixedSize(new int[]{1, 4096}, DataType.FLOAT32);
        inputFeature1.loadBuffer(byteBuffer1);

        // Runs model inference and gets result.
        Model3.Outputs outputs = model.process(inputFeature0, inputFeature1);
        TensorBuffer outputFeature0 = outputs.getOutputFeature0AsTensorBuffer();

        // Releases model resources if no longer used.
        model.close();

        tv.setText((int) outputFeature0.getFloatArray()[0]);
    } catch (IOException e) {
        // TODO Handle the exception
    }
}
});
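For context on why `loadBuffer` can reject the buffer here: a FLOAT32 `TensorImage` loaded from an RGB bitmap typically holds width × height × 3 channels × 4 bytes, while a `[1, 37]` FLOAT32 `TensorBuffer` expects exactly 1 × 37 × 4 bytes, so the two byte counts differ. The sketch below is illustrative arithmetic only (the class and method names are my own, not part of the TFLite API) and assumes a 3-channel float image, which is the usual layout for a bitmap-backed `TensorImage`:

```java
// Illustrative size arithmetic: shows the byte-count mismatch that
// triggers "The size of byte buffer and the shape do not match".
public class BufferSizeCheck {

    // Bytes a FLOAT32 TensorBuffer of the given shape expects.
    static int tensorBufferBytes(int[] shape) {
        int elements = 1;
        for (int d : shape) {
            elements *= d;
        }
        return elements * 4; // 4 bytes per float
    }

    // Bytes a FLOAT32 TensorImage holds for a width x height RGB bitmap.
    static int tensorImageBytes(int width, int height) {
        return width * height * 3 * 4; // 3 channels, 4 bytes per float
    }

    public static void main(String[] args) {
        // Expected by the [1, 37] input: 148 bytes.
        System.out.println(tensorBufferBytes(new int[]{1, 37}));
        // Provided by a 1x37 RGB float image: 444 bytes.
        System.out.println(tensorImageBytes(1, 37));
        // Same mismatch for the second input: 16384 vs 49152 bytes.
        System.out.println(tensorBufferBytes(new int[]{1, 4096}));
        System.out.println(tensorImageBytes(1, 4096));
    }
}
```

If the model inputs really are flat feature vectors of 37 and 4096 floats (not images), filling the `TensorBuffer`s from `float[]` arrays via `loadArray` instead of going through a scaled bitmap and `TensorImage` would avoid the channel-dimension mismatch entirely.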
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow