ValueError: Data cardinality is ambiguous (x sizes: 60000, y sizes: 10000). Make sure all arrays contain the same number of samples.
I'm using the Fashion-MNIST dataset for my project in the VS Code IDE. The complete code follows. What am I doing wrong, and how can I solve this error?
# Import Libraries
import numpy as np
import matplotlib.pyplot as mtplt
import matplotlib.image as mpimg
import seaborn as sns
import tensorflow as tf
from tensorflow.python.framework import ops
from PIL import Image
# Import fashion-mnist
(train_images, train_labels), (test_images, test_labels) = tf.keras.datasets.fashion_mnist.load_data()
class_names = ['T-shirt/Top', 'Trouser', 'Pullover', 'Dress', 'Coat', 'Sandal', 'Shirt', 'Sneaker', 'Ankle Boot']
train_images = train_images / 255.0
test_images = test_images / 255.0
model = tf.keras.Sequential()
model.add(tf.keras.layers.Flatten(input_shape = (28, 28)))
model.add(tf.keras.layers.Dense(128, activation='relu'))
model.add(tf.keras.layers.Dense(10, activation='softmax'))
model.compile(optimizer=tf.optimizers.Adam(), loss='sparse_categorical_crossentropy', metrics=['accuracy'])
model.fit(train_images, train_labels, epochs=5)
# Test with 10,000 images
test_loss, test_acc = model.evaluate(train_images, test_labels)
print('10,000 test image accuracy: ', test_acc)
I get the following error when I run the code. Below is the complete traceback of the most recent call.
Epoch 1/5
1875/1875 [==============================] - 13s 6ms/step - loss: 0.5031 - accuracy: 0.8248
Epoch 2/5
1875/1875 [==============================] - 16s 8ms/step - loss: 0.3777 - accuracy: 0.8655
Epoch 3/5
1875/1875 [==============================] - 9s 5ms/step - loss: 0.3365 - accuracy: 0.8781
Epoch 4/5
1875/1875 [==============================] - 9s 5ms/step - loss: 0.3139 - accuracy: 0.8850
Epoch 5/5
1875/1875 [==============================] - 9s 5ms/step - loss: 0.2951 - accuracy: 0.8915
Traceback (most recent call last):
File "c:/Users/coderex/Documents/Py3.0/AIO/trial.py", line 35, in <module>
test_loss, test_acc = model.evaluate(train_images, test_labels)
File "C:\Users\coderex\Documents\Py3.0\AIO\myenv\lib\site-packages\keras\utils\traceback_utils.py", line 67, in error_handler
raise e.with_traceback(filtered_tb) from None
File "C:\Users\coderex\Documents\Py3.0\AIO\myenv\lib\site-packages\keras\engine\data_adapter.py", line 1653, in _check_data_cardinality
raise ValueError(msg)
ValueError: Data cardinality is ambiguous:
x sizes: 60000
y sizes: 10000
Make sure all arrays contain the same number of samples.
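The traceback points at the `model.evaluate(train_images, test_labels)` line: Keras refuses to pair inputs and labels that have different sample counts, and here 60,000 training images are paired with 10,000 test labels. A minimal sketch with dummy NumPy arrays of the same shapes as the Fashion-MNIST splits (no model needed) illustrates the cardinality check:

```python
import numpy as np

# Dummy arrays with the same shapes as the Fashion-MNIST splits above.
train_images = np.zeros((60000, 28, 28))           # 60,000 training samples
test_images = np.zeros((10000, 28, 28))            # 10,000 test samples
test_labels = np.zeros((10000,), dtype=np.int64)   # 10,000 test labels

# model.evaluate(train_images, test_labels) pairs 60,000 inputs with
# 10,000 labels, which triggers "Data cardinality is ambiguous".
print(len(train_images), len(test_labels))  # 60000 10000 -> mismatch

# The matched test split has equal sample counts, so
# model.evaluate(test_images, test_labels) runs without error.
print(len(test_images), len(test_labels))   # 10000 10000 -> OK
```

So the fix is to evaluate on the matching test split: replace `model.evaluate(train_images, test_labels)` with `model.evaluate(test_images, test_labels)`.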
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow