Will y_train change during training in Keras?

I am trying to write a custom metric in keras like this:

import numpy as np
import tensorflow as tf
from tensorflow.keras import backend as K

def C_index1(E, T):
  # E and T are fixed NumPy arrays, captured by the closure below
  T = T.reshape(len(T), 1)
  T_ind = T > T.T                                   # T_ind[i, j] = (T[i] > T[j])
  E_ind = E.reshape(len(E), 1)
  E_ind = np.matmul(E_ind, np.ones((1, len(E)))).T  # E_ind[i, j] = E[j]
  T_ind = K.cast(T_ind, tf.float64)
  E_ind = K.cast(E_ind, tf.float64)
  
  def metric(y_true, y_pred):
    H = K.reshape(y_true, (len(T), 1))
    ita_ind = K.cast(H > K.transpose(H), tf.float64)  # ita_ind[i, j] = (y_true[i] > y_true[j])
    M2 = T_ind * E_ind
    M1 = M2 * ita_ind     # M1 depends only on E, T and y_true
    C_index = K.sum(M1)   # y_pred is never used
    return C_index
  
  return metric
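For reference, here is a NumPy-only sketch of what the indicator matrices above compute, using small made-up values for E, T and y_true (the names follow my code):

```python
import numpy as np

# Made-up example data
T = np.array([1.0, 2.0, 3.0])
E = np.array([1.0, 0.0, 1.0])

T_col = T.reshape(len(T), 1)
T_ind = (T_col > T_col.T).astype(float)           # T_ind[i, j] = 1 if T[i] > T[j]

E_col = E.reshape(len(E), 1)
E_ind = np.matmul(E_col, np.ones((1, len(E)))).T  # E_ind[i, j] = E[j]

# y_true in the same order as T for this example
y_true = np.array([0.5, 1.5, 2.5])
H = y_true.reshape(len(T), 1)
ita_ind = (H > H.T).astype(float)                 # ita_ind[i, j] = 1 if y_true[i] > y_true[j]

M1 = T_ind * E_ind * ita_ind
c = M1.sum()
print(c)  # 2.0: pairs (1,0) and (2,0) count; (2,1) is excluded because E[1] = 0
```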


model = tf.keras.Sequential([
                             tf.keras.layers.Flatten(input_dim=10),
                             tf.keras.layers.Dense(17, activation='relu'),
                             tf.keras.layers.Dense(17, activation='relu'),
                             #tf.keras.layers.Dense(17, activation='selu'),
                             tf.keras.layers.Dropout(0.1),
                             tf.keras.layers.Dense(1)
])

order_t = find_order(df['t'])
x_train = df['x'][order_t,:]
y_train = df['hr'][order_t]
T = df['t'][order_t]
E = df['e'][order_t]
base_learning = 0.001
optimizer = tf.keras.optimizers.Adam(learning_rate=0.3194*base_learning)
model.compile(optimizer=optimizer,
              loss="mean_absolute_error",
              metrics=[C_index1(E, T)]
             )


history = model.fit(x_train, y_train, 
                    batch_size=len(y_train),  # full batch size
                    epochs=100,
                    )

Epoch 1/100
1/1 [==============================] - 1s 549ms/step - loss: 1.1079 - metric: 27.0000
Epoch 2/100
1/1 [==============================] - 0s 7ms/step - loss: 1.1524 - metric: 27.0000
Epoch 3/100
1/1 [==============================] - 0s 7ms/step - loss: 1.1813 - metric: 18.0000
Epoch 4/100
1/1 [==============================] - 0s 5ms/step - loss: 1.1465 - metric: 21.0000
Epoch 5/100
1/1 [==============================] - 0s 9ms/step - loss: 1.1400 - metric: 28.0000
Epoch 6/100
1/1 [==============================] - 0s 7ms/step - loss: 1.1629 - metric: 32.0000
Epoch 7/100
1/1 [==============================] - 0s 8ms/step - loss: 1.1363 - metric: 31.0000
Epoch 8/100
1/1 [==============================] - 0s 7ms/step - loss: 1.1471 - metric: 24.0000
Epoch 9/100
1/1 [==============================] - 0s 10ms/step - loss: 1.0663 - metric: 18.0000
Epoch 10/100
1/1 [==============================] - 0s 8ms/step - loss: 1.0967 - metric: 17.0000
Epoch 11/100
1/1 [==============================] - 0s 7ms/step - loss: 1.0906 - metric: 14.0000
Epoch 12/100
1/1 [==============================] - 0s 15ms/step - loss: 1.1285 - metric: 21.0000
Epoch 13/100
1/1 [==============================] - 0s 10ms/step - loss: 1.1132 - metric: 16.0000
Epoch 14/100
1/1 [==============================] - 0s 9ms/step - loss: 1.0886 - metric: 8.0000
Epoch 15/100
1/1 [==============================] - 0s 9ms/step - loss: 1.1226 - metric: 19.0000
Epoch 16/100
1/1 [==============================] - 0s 8ms/step - loss: 1.0699 - metric: 27.0000

My custom metric depends only on y_true (y_train) and the outer parameters E and T, which are predefined. Moreover, since I train with the full batch, y_train is fixed. I therefore expected the metric value to stay constant during training, because y_true (y_train), E and T are all fixed. However, the metric clearly changes across epochs when I train with model.fit(). Why does this happen? Is y_train being changed inside my custom metric during training?
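To rule out the metric itself, here is a NumPy-only sketch (again with made-up values) showing that the metric value depends on the *order* of y_true relative to the fixed E and T: permuting y_true while keeping T_ind and E_ind fixed changes the result, even though the set of values is unchanged. (This is the kind of reordering that fit(..., shuffle=True), the default, would apply to each epoch's batch.)

```python
import numpy as np

# Made-up data; T_ind and E_ind are fixed once, as in C_index1
T = np.array([1.0, 2.0, 3.0, 4.0])
E = np.array([1.0, 0.0, 1.0, 1.0])

T_ind = (T.reshape(-1, 1) > T.reshape(1, -1)).astype(float)  # T_ind[i, j] = 1 if T[i] > T[j]
E_ind = np.matmul(E.reshape(-1, 1), np.ones((1, len(E)))).T  # E_ind[i, j] = E[j]
M2 = T_ind * E_ind

def c_index(y_true):
    H = y_true.reshape(-1, 1)
    ita_ind = (H > H.T).astype(float)  # ita_ind[i, j] = 1 if y_true[i] > y_true[j]
    return (M2 * ita_ind).sum()

y = np.array([0.5, 1.5, 2.5, 3.5])
print(c_index(y))        # 4.0 with the original order
print(c_index(y[::-1]))  # 0.0 with the reversed order -- same values, different result
```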



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
