How to remove normalization from Keras
I have an example of a Keras neural network, and the example uses data normalization. I think that causes problems, because the training inputs are scaled down while the target labels remain in their original value range. Can someone advise how to remove the normalization so the process works correctly? The code is:
```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

# Learn per-feature mean/variance from the training data
normalizer = tf.keras.layers.Normalization(axis=-1)
normalizer.adapt(np.array(train_features))

def build_and_compile_model(norm):
    model = keras.Sequential([
        norm,
        layers.Dense(neurons, activation='relu'),
        layers.Dense(neurons, activation='relu'),
        layers.Dense(neurons, activation='relu'),
        layers.Dense(1)
    ])
    model.compile(loss='mean_absolute_error',
                  optimizer=tf.keras.optimizers.Adam(0.001))
    return model

dnn_model = build_and_compile_model(normalizer)
```
Thank you so much!
Solution 1:[1]
You can simply remove the normalization layer from your model, like so:
```python
def build_and_compile_model(norm):
    # The `norm` argument is no longer used; the first Dense layer
    # now receives the raw, unscaled features directly.
    model = keras.Sequential([
        layers.Dense(neurons, activation='relu'),
        layers.Dense(neurons, activation='relu'),
        layers.Dense(neurons, activation='relu'),
        layers.Dense(1)
    ])
    model.compile(loss='mean_absolute_error',
                  optimizer=tf.keras.optimizers.Adam(0.001))
    return model
```
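For context, the layer being removed standardizes each input feature to zero mean and unit variance using statistics collected by `adapt()`. A minimal NumPy sketch of that behavior, using a hypothetical `train_features` matrix, shows what the model stops doing once the layer is gone:

```python
import numpy as np

# Hypothetical stand-in for train_features (3 samples, 2 features).
train_features = np.array([[1.0, 10.0],
                           [2.0, 20.0],
                           [3.0, 30.0]])

# Keras' Normalization layer applies (x - mean) / sqrt(variance)
# per feature column, with the statistics learned during adapt().
mean = train_features.mean(axis=0)
std = train_features.std(axis=0)
normalized = (train_features - mean) / std

# Each column of `normalized` now has mean ~0 and std ~1;
# after removing the layer, the network sees the raw values instead.
print(normalized.mean(axis=0))
print(normalized.std(axis=0))
```

If the raw feature scales differ wildly, training without this step may converge more slowly, so it is worth checking the loss curve after the change.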
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Ali Haider |
