Keras cross entropy loss with missing labels in multi-objective training
I have a Keras neural network, built with the Functional API, that has multiple outputs and multiple loss functions (some regression, some multi-class classification). Every training sample has a label for at least one of the outputs, but labels for one or more of the other outputs are commonly missing.
I'm trying to write a custom categorical cross entropy loss function:
```python
def custom_error_function(y_true, y_pred):
    bool_finite = y_true != -1
    loss = keras.losses.CategoricalCrossentropy(from_logits=True)
    one_hotted = one_hot(np.int(boolean_mask(y_true, bool_finite)), depth=5)
    return loss(one_hotted, boolean_mask(y_pred, bool_finite, axis=1))
```
where `y_pred` and `y_true` should have the same shape, `[n_samples_in_batch, n_classes]` with `n_classes = 5`, and a value of -1 in `y_true` indicates a missing label.
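For reference, here is a toy sketch (not the actual model) of how the elementwise mask behaves: with a 2-D `y_true`, the comparison `y_true != -1` yields a boolean mask of the same shape, and `tf.boolean_mask` with such a mask keeps only the `True` elements, flattening the result to 1-D.

```python
import tensorflow as tf

# Toy batch: 2 samples, 5 classes; the second sample is unlabeled (all -1).
y_true = tf.constant([[0., 1., 0., 0., 0.],
                      [-1., -1., -1., -1., -1.]])

mask = y_true != -1                   # elementwise boolean mask, shape (2, 5)
kept = tf.boolean_mask(y_true, mask)  # keeps only True elements -> shape (5,)

print(mask.shape)  # (2, 5)
print(kept.shape)  # (5,)
```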
But when I run this, I get
```
ValueError: in user code:

    File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/keras/engine/training.py", line 1021, in train_function  *
        return step_function(self, iterator)
    File "/var/folders/pn/c0hwfk8n7q9442628b1g_p1r0000gp/T/ipykernel_13239/802342025.py", line 12, in custom_error_function  *
        return loss(one_hotted, boolean_mask(y_pred, bool_finite, axis=1))

    ValueError: Shapes (5,) and (None, 1) are incompatible
```
I'm a bit flummoxed and would appreciate any assistance. Thanks!
Solution 1:[1]
The problem comes from the `axis=1` in the loss call; the following should work:
```python
import tensorflow as tf
from tensorflow import keras

def custom_error_function(y_true, y_pred):
    bool_finite = y_true != -1
    loss = keras.losses.CategoricalCrossentropy(from_logits=True)
    return loss(tf.boolean_mask(y_true, bool_finite),
                tf.boolean_mask(y_pred, bool_finite))
```
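For illustration, a self-contained sketch of the corrected loss on a toy batch (the all -1 row marks an unlabeled sample, as in the question; the logits are made-up values):

```python
import tensorflow as tf
from tensorflow import keras

def custom_error_function(y_true, y_pred):
    bool_finite = y_true != -1
    loss = keras.losses.CategoricalCrossentropy(from_logits=True)
    return loss(tf.boolean_mask(y_true, bool_finite),
                tf.boolean_mask(y_pred, bool_finite))

# 3 samples, 5 classes; the second sample carries no label (all -1)
y_true = tf.constant([[0., 1., 0., 0., 0.],
                      [-1., -1., -1., -1., -1.],
                      [0., 0., 0., 0., 1.]])
y_pred = tf.constant([[2.0, 5.0, 0.1, 0.1, 0.1],
                      [1.0, 1.0, 1.0, 1.0, 1.0],
                      [0.1, 0.1, 0.1, 0.1, 4.0]])

value = custom_error_function(y_true, y_pred)
print(float(value))  # finite, non-negative; the unlabeled row is excluded
```

One caveat: the elementwise mask flattens both tensors, so the softmax is taken over all kept elements jointly rather than per sample. If that matters for your setup, masking whole rows instead (e.g. `tf.reduce_all(y_true != -1, axis=-1)`) preserves the per-sample class axis.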
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | elbe |
