Keras: activation layer with functional API

I am using the functional API of Keras and would like to add an activation layer to my model, but when I do this, the model's performance is much worse. Even if I use a plain ReLU layer instead of specifying the activation function directly in the Dense layer, the performance is worse.

Shouldn't these two networks give about the same result?

hidden_Layer_1 = Dense(76, activation='relu')(input)
dropout_1 = Dropout(0.4)(hidden_Layer_1)

and

...
hidden_Layer_1 = Dense(76)(input)
act1 = tf.keras.layers.ReLU()(hidden_Layer_1)
dropout_1 = Dropout(0.4)(act1)
...
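
For reference, both snippets describe the same computation: a linear Dense layer followed by a ReLU, so the results should be comparable up to random initialization and training noise. Below is a minimal, self-contained sketch (the input width of 20, the single-unit output head, and the seed value are assumptions, not part of the original model) that builds both variants from the same seed so they start with identical weights:

import tensorflow as tf
from tensorflow.keras.layers import Input, Dense, Dropout, ReLU
from tensorflow.keras.models import Model

def build_model(separate_relu: bool) -> Model:
    # Assumed input width of 20 features; the original input shape is not shown.
    inputs = Input(shape=(20,))
    if separate_relu:
        x = Dense(76)(inputs)    # linear Dense ...
        x = ReLU()(x)            # ... followed by an explicit ReLU layer
    else:
        x = Dense(76, activation='relu')(inputs)  # ReLU fused into the Dense layer
    x = Dropout(0.4)(x)
    outputs = Dense(1)(x)        # assumed single-output head, for illustration only
    return Model(inputs, outputs)

# Resetting the global seed before each build should give both models the same
# initial weights, so any remaining performance gap comes from the training run
# (data order, dropout masks), not from how the ReLU is attached.
tf.keras.utils.set_random_seed(0)
model_fused = build_model(separate_relu=False)

tf.keras.utils.set_random_seed(0)
model_separate = build_model(separate_relu=True)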

