Ftrl optimizer does not distribute classes correctly
My task is to create an RNN with one LSTM layer, a softmax activation function, and the FTRL optimizer. I followed an example that uses the Adam optimizer. With that optimizer I get this graph: Adam optimizer
When I change the optimizer to FTRL, I get this graph instead: FTRL optimizer
Can anyone explain what the problem is? How can I get a class distribution graph like the one produced by the Adam optimizer?
Code:
```python
import tensorflow as tf
import numpy as np
import matplotlib.pyplot as plt
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score


def plot_scatter(pred):
    plt.figure()
    plt.scatter(range(0, len(pred)), pred)
    plt.xlabel('x')
    plt.ylabel('y')
    plt.show()


train_size = 5000
n_samples = train_size + 100
n_features = 10
n_classes = 5

x_data, y_data = make_classification(
    n_samples=n_samples,
    n_features=n_features,
    n_informative=10,
    n_redundant=0,
    n_classes=n_classes
)

x_train = x_data[0:train_size]
y_train = y_data[0:train_size]
x_test = x_data[train_size:]
y_test = y_data[train_size:]

model = tf.keras.Sequential()
model.add(tf.keras.layers.LSTM(200,
                               return_sequences=True,
                               input_shape=(n_features, 1)))
model.add(tf.keras.layers.LSTM(200))
model.add(tf.keras.layers.Dense(n_classes, activation='softmax'))
model.compile(optimizer='Ftrl',
              loss='sparse_categorical_crossentropy')

# LSTM layers expect 3-D input: (samples, timesteps, features)
x_train = np.expand_dims(x_train, axis=2)
x_test = np.expand_dims(x_test, axis=2)

model.fit(
    x_train,
    y_train,
    validation_data=(x_test, y_test),
    epochs=10
)

if __name__ == '__main__':
    y_pred = model.predict(x_test)
    classes = np.argmax(y_pred, axis=1)
    print(accuracy_score(y_test, classes))
    plot_scatter(classes)
```
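A quick way to see whether the FTRL-trained model has collapsed onto a single class (which the scatter plot suggests) is to count the predicted labels directly instead of eyeballing the plot. This is a small diagnostic sketch in plain NumPy; `class_distribution` is a hypothetical helper, and the `collapsed`/`spread` arrays are made-up stand-ins for the `classes` array produced above:

```python
import numpy as np

def class_distribution(pred_labels, n_classes):
    """Return the fraction of predictions assigned to each class."""
    counts = np.bincount(pred_labels, minlength=n_classes)
    return counts / counts.sum()

# A collapsed model predicts one class for everything:
collapsed = np.zeros(100, dtype=int)
print(class_distribution(collapsed, 5))  # -> [1. 0. 0. 0. 0.]

# A healthy model spreads predictions across classes:
spread = np.arange(100) % 5
print(class_distribution(spread, 5))     # -> [0.2 0.2 0.2 0.2 0.2]
```

If `class_distribution(classes, n_classes)` puts nearly all the mass on one class with FTRL but not with Adam, the optimizer is driving the model to a degenerate solution rather than the plot being misleading.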
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow