'ListWrapper' object has no attribute 'get_config' error when doing grid search

I need to run a grid search on my DNN, but I am getting an error from the GridSearchCV call. Here is the code I used to create and compile the model, followed by the grid search itself.

import matplotlib.pyplot as plt
import numpy as np
import tensorflow as tf
from keras.models import Sequential
from keras.layers import Dense, Dropout
import keras,sklearn
import os
os.environ['KMP_DUPLICATE_LIB_OK'] = 'True'
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '2'
from sklearn.model_selection import GridSearchCV
from keras.wrappers.scikit_learn import KerasClassifier


CASE = 1
if CASE == 1:
    model = Sequential()
    model.add(Dense(L,input_shape=(L,), activation ="relu"))
    model.add(Dense(20, activation= 'relu'))
    model.add(Dense(20, activation = 'relu'))
    model.add(Dropout(0.2))
    model.add(Dense(1, activation = 'sigmoid'))
    nepoch = 400
if CASE == 2:
    model = Sequential()
    model.add(Dense(L, input_shape=(L,), activation= 'sigmoid'))
    model.add(Dense(3, activation= 'sigmoid'))
    model.add(Dense(1, activation= 'sigmoid'))
    nepoch = 400
    
model.compile(loss = 'binary_crossentropy',
             optimizer = optimizer,
             metrics = ['accuracy'])
model_gridsearch = KerasClassifier(build_fn=model, 
                        epochs=1, 
                        batch_size=50, 
                        verbose=1)

optimizer = ['sgd', 'rmsprop', 'adadelta', 'adam', 'adamax'] 

param_grid = dict(optimizer=optimizer)
grid = GridSearchCV(estimator=model_gridsearch, param_grid=param_grid, n_jobs=1, cv=4)
grid_result = grid.fit(x_train,y_train)

The error occurs on grid_result = grid.fit(x_train, y_train) and says AttributeError: 'ListWrapper' object has no attribute 'get_config'. My TensorFlow version is 2.8.0, if that helps.



Solution 1:[1]

The issue is with the optimizer list. If you want to train with multiple optimizers, you can use the optimizer wrapper API tfa.optimizers.MultiOptimizer from TensorFlow Addons.

import tensorflow as tf
import tensorflow_addons as tfa

optimizers = [
    tf.keras.optimizers.SGD(learning_rate=1e-4),
    tf.keras.optimizers.RMSprop(learning_rate=1e-4),
    tf.keras.optimizers.Adadelta(learning_rate=1e-4),
    tf.keras.optimizers.Adam(learning_rate=1e-2),
    tf.keras.optimizers.Adamax(learning_rate=1e-4)
]

# Pair each optimizer with the layer(s) it should train; the slices are illustrative.
optimizers_and_layers = [
    (optimizers[0], model.layers[0:1]),
    (optimizers[1], model.layers[1:2]),
    (optimizers[2], model.layers[3:4]),
    # ... pair the remaining optimizers with the remaining layers ...
]
optimizer = tfa.optimizers.MultiOptimizer(optimizers_and_layers)
model.compile(optimizer=optimizer, loss='mse', metrics=['mse'])
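
For reference, below is a minimal self-contained sketch of the same MultiOptimizer pattern applied to a small binary classifier shaped like CASE 1 from the question. The input dimension L, the layer grouping, the learning rates, and the dummy training data are placeholder choices for illustration, not part of the original answer.

import numpy as np
import tensorflow as tf
import tensorflow_addons as tfa
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout

L = 10  # placeholder input dimension

# Same architecture as CASE 1 in the question.
model = Sequential([
    Dense(L, input_shape=(L,), activation='relu'),
    Dense(20, activation='relu'),
    Dense(20, activation='relu'),
    Dropout(0.2),
    Dense(1, activation='sigmoid'),
])

# One optimizer per group of layers; the grouping here is arbitrary.
optimizers_and_layers = [
    (tf.keras.optimizers.Adam(learning_rate=1e-3), model.layers[:3]),
    (tf.keras.optimizers.SGD(learning_rate=1e-4), model.layers[3:]),
]
optimizer = tfa.optimizers.MultiOptimizer(optimizers_and_layers)

model.compile(optimizer=optimizer, loss='binary_crossentropy', metrics=['accuracy'])

# Dummy data just to show that training runs.
x_train = np.random.rand(200, L).astype('float32')
y_train = np.random.randint(0, 2, size=(200, 1)).astype('float32')
model.fit(x_train, y_train, epochs=1, batch_size=50, verbose=1)

Each optimizer updates only the layers assigned to it, so in practice every trainable layer is usually assigned to exactly one optimizer.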

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: TFer