Implementation of tf.nn.weighted_cross_entropy_with_logits
I am trying to add class weights for imbalanced data to a Keras functional API model with multiple output layers.
I tried class_weight, but I could not get it to work, either when my model had a single output layer with a one-hot output or when I split the output into 50+ classes, each with its own output layer (class_weight requires a dict in TF 2.8). I also tried tf.nn.weighted_cross_entropy_with_logits as a loss function, but it does not work as a normal loss: it has to be wrapped in another function, and none of the wrappers I found work for me or are explained.
Some of the errors were:
Missing required positional argument
TypeError: Input 'y' of 'Mul' Op has type float32 that does not match type int64 of argument 'x'.
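For reference, this is the kind of wrapped loss I mean; a minimal sketch (the pos_weight argument and the cast of y_true to float32 are my assumptions, the cast being aimed at the Mul type error above, and it assumes the output layers produce raw logits):

import tensorflow as tf

def weighted_binary_crossentropy(pos_weight):
    # Sketch of a wrapper so tf.nn.weighted_cross_entropy_with_logits can be
    # used as a regular Keras loss; assumes y_pred holds raw logits (no sigmoid).
    def loss_fn(y_true, y_pred):
        # Cast the integer labels to float32, otherwise the internal Mul op
        # raises the float32 vs int64 TypeError shown above.
        y_true = tf.cast(y_true, tf.float32)
        return tf.nn.weighted_cross_entropy_with_logits(
            labels=y_true, logits=y_pred, pos_weight=pos_weight)
    return loss_fn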
I use Colab. My data outputs:
x, y = next(valid_generator)
np.array(y).shape
(51, 32, 1)
i.e. 51 arrays, each of shape (32, 1), with a batch size of 32 (one target per output head).
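For what it's worth, here is a quick sanity check that the generator targets line up with a 51-output model (the shapes asserted below are just the ones printed above):

x, y = next(valid_generator)
y = list(y)                   # one target array per output head
assert len(y) == 51           # 51 output heads
assert y[0].shape == (32, 1)  # batch of 32, one binary label each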
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

input_shape = 256
channels = 1

inputs = keras.Input(shape=(input_shape, input_shape, channels), name="img")
x = layers.Conv2D(128, 3, activation="relu")(inputs)
pooling = layers.MaxPooling2D()(x)
conv = layers.Conv2D(64, 3, activation="relu")(pooling)
pooling = layers.MaxPooling2D()(conv)
conv = layers.Conv2D(64, 3, activation="relu")(pooling)
pooling = layers.MaxPooling2D()(conv)
conv = layers.Conv2D(64, 3, activation="relu")(pooling)
pooling = layers.MaxPooling2D()(conv)
flat = layers.Flatten()(pooling)
pflat = layers.Dense(256, activation="relu")(flat)
pflat = layers.Dropout(0.5)(pflat)  # normal
output_solo = []
mmetrics = {}
loss = {}
class_weights = {}

# One single-unit output head per label column, each with its own loss,
# metric and class-weight entry, keyed by the output layer's name.
for i in df.iloc[:, 35:-1].columns:
    out = layers.Dense(1, activation="relu", name=f'{"_".join(i.split())}_out')(pflat)
    output_solo.append(out)
    mmetrics[f'{"_".join(i.split())}_out'] = tf.keras.metrics.Accuracy(name=f'Accuracy{i}out')
    loss[f'{"_".join(i.split())}_out'] = weighted_binary_crossentropy(40)
    class_weights[f'{"_".join(i.split())}_out'] = {0: df[i].mean(), 1: 1}
model = keras.Model(inputs, output_solo, name="shit")
model.summary()
keras.utils.plot_model(model, "shit.png", show_shapes=True)
model.compile(
    loss=loss,
    metrics=mmetrics,
    optimizer=tf.keras.optimizers.Adam()
)
plateau = tf.keras.callbacks.ReduceLROnPlateau(monitor='BinaryAccuracy_normal',
                                               patience=1,
                                               factor=0.01,
                                               mode='max')
early_stopping = tf.keras.callbacks.EarlyStopping(monitor='BinaryAccuracy_normal',
                                                  patience=3,
                                                  restore_best_weights=True,
                                                  mode='max')
check_point = tf.keras.callbacks.ModelCheckpoint(
    filepath='/content/drive/MyDrive/save',
    monitor="BinaryAccuracy_normal",
    verbose=0,
    save_best_only=True,
    save_weights_only=True,
    mode="auto",
    save_freq="epoch")
num_epochs = 1
hist = model.fit(
    train_generator,
    epochs=num_epochs, steps_per_epoch=400,
    validation_data=valid_generator,
    validation_steps=400,
    callbacks=[check_point])