TensorFlow: transfer learning by removing and adding a layer

I am trying to reuse an already-trained model: remove its output layer and replace it with a new, untrained one. This code worked a year ago, but it no longer does:

from tensorflow.keras import Model, Sequential, initializers
from tensorflow.keras.layers import Dense

# keep everything up to the penultimate layer of the trained model
model2 = Model(inputs=source_model.input, outputs=source_model.layers[-2].output)
source_model.summary()
model2.summary()

new_mod = Sequential()
new_mod.add(model2)

# add/replace the output layer
new_mod.add(Dense(ny_train.shape[1], activation='linear',
                  kernel_initializer=initializers.he_uniform(),
                  name='new_final_output'))

I get this error on the line new_mod.add(model2):

AssertionError: Could not compute output KerasTensor(type_spec=TensorSpec(shape=(None, 
50), dtype=tf.float32, name=None), name='dropout_8/Identity:0', description="created by 
layer 'dropout_8'")

Extra question: is it possible to do this with the functional API?
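
For reference, here is a minimal sketch of the purely functional approach: instead of wrapping the truncated model in a Sequential (which is what triggers the AssertionError on recent TF/Keras versions), the new head is attached directly to the penultimate layer's output tensor. The small stand-in source model, its layer sizes, and the output width of 4 are assumptions for illustration only, standing in for the real pre-trained model and ny_train.shape[1].

```python
import tensorflow as tf
from tensorflow.keras import Model, initializers
from tensorflow.keras.layers import Dense, Dropout, Input

# Stand-in for the pre-trained source model (shapes are illustrative)
inputs = Input(shape=(10,))
x = Dense(50, activation='relu')(inputs)
x = Dropout(0.2)(x)            # penultimate layer, output shape (None, 50)
old_out = Dense(3, name='old_output')(x)
source_model = Model(inputs, old_out)

# Grab the penultimate layer's output tensor and attach a fresh head
penultimate = source_model.layers[-2].output
n_outputs = 4                  # would be ny_train.shape[1] in the question
new_out = Dense(n_outputs, activation='linear',
                kernel_initializer=initializers.he_uniform(),
                name='new_final_output')(penultimate)

# Build the new model functionally; no Sequential wrapper is needed
new_mod = Model(inputs=source_model.input, outputs=new_out)
print(new_mod.output_shape)    # (None, 4)
```

The new model shares the pre-trained weights with source_model up to the dropout layer; only the new Dense head starts untrained.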



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
