Target shows zero values in Bayesian optimization for a regression neural network
I am writing code to run Bayesian optimization over the hyperparameters of my regression neural network. However, I noticed that the target values it reports are zero all the way through. I can't tell what is wrong; maybe there is a mistake in the code. Is this correct, or am I missing something?
# Create function
def nn_cl_bo(neurons, activation, optimizer, learning_rate, batch_size, epochs):
    optimizerL = ['SGD', 'Adam', 'RMSprop', 'Adadelta', 'Adagrad', 'Adamax',
                  'Nadam', 'Ftrl', 'SGD']
    optimizerD = {'Adam': Adam(lr=learning_rate), 'SGD': SGD(lr=learning_rate),
                  'RMSprop': RMSprop(lr=learning_rate), 'Adadelta': Adadelta(lr=learning_rate),
                  'Adagrad': Adagrad(lr=learning_rate), 'Adamax': Adamax(lr=learning_rate),
                  'Nadam': Nadam(lr=learning_rate), 'Ftrl': Ftrl(lr=learning_rate)}
    activationL = ['relu', 'sigmoid', 'softplus', 'softsign', 'tanh', 'selu',
                   'elu', 'exponential', LeakyReLU, 'relu']
    neurons = round(neurons)
    activation = activationL[round(activation)]
    batch_size = round(batch_size)
    epochs = round(epochs)

    # Build model architecture
    def nn_cl_fun():
        opt = Adam(lr=learning_rate)
        nn = Sequential()
        nn.add(Dense(neurons, input_dim=11, kernel_initializer='normal', activation=activation))
        nn.add(Dense(neurons, kernel_initializer='normal', activation=activation))
        nn.add(Dense(1, kernel_initializer='normal'))
        nn.compile(loss='mean_squared_error', optimizer=opt, metrics=['accuracy'])
        return nn

    es = EarlyStopping(monitor='accuracy', mode='max', verbose=0, patience=20)
    nn = KerasRegressor(build_fn=nn_cl_fun, epochs=epochs, batch_size=batch_size,
                        verbose=0)
    kfold = KFold(n_splits=5, shuffle=True, random_state=123)
    score = cross_val_score(nn, X_train, y_train, scoring=score_acc, cv=kfold,
                            fit_params={'callbacks': [es]})
    score = np.nan_to_num(score)
    score = score.mean()
    return score
# Make scorer accuracy
score_acc = make_scorer(accuracy_score)
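As a side note on where the zeros could come from: accuracy counts exact label matches, so on continuous regression targets it is effectively always zero, and scikit-learn's `accuracy_score` can reject continuous targets outright, leaving each fold's score as NaN, which `np.nan_to_num` then maps to 0.0. A minimal stdlib-only sketch of the exact-match issue (`exact_match_accuracy` and `nan_to_num` here are simplified stand-ins written for illustration, not the library functions):

```python
import math

# Simplified stand-ins (NOT the scikit-learn/numpy functions) to show the effect.
def exact_match_accuracy(y_true, y_pred):
    # accuracy-style scoring: count exact matches only
    hits = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return hits / len(y_true)

def nan_to_num(x):
    # scalar version of what np.nan_to_num does to a failed fold score
    return 0.0 if math.isnan(x) else x

y_true = [2.31, 0.57, 1.04, 3.88]
y_pred = [2.30, 0.60, 1.00, 3.90]   # close predictions, but never exactly equal
print(exact_match_accuracy(y_true, y_pred))  # 0.0

print(nan_to_num(float("nan")))  # 0.0 -- a NaN fold score becomes a zero target
```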
# Set parameters
params_nn ={
'neurons': (10, 100),
'activation':(0, 9),
'optimizer':(0,7),
'learning_rate':(0.01, 1),
'batch_size':(100, 1000),
'epochs':(20, 100)
}
# Run Bayesian Optimization
nn_bo = BayesianOptimization(nn_cl_bo, params_nn, random_state=111)
nn_bo.maximize(init_points=25, n_iter=4)
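One thing worth noting about how the search space works: `BayesianOptimization` only samples continuous values, so the function above maps each sampled float to a categorical choice by rounding it to a list index. A small sketch of that mapping, using the activation list from the code (with 'LeakyReLU' written as a string here purely for illustration):

```python
activationL = ['relu', 'sigmoid', 'softplus', 'softsign', 'tanh', 'selu',
               'elu', 'exponential', 'LeakyReLU', 'relu']

def pick_activation(x):
    # round the optimizer's continuous sample (0..9) to a list index
    return activationL[round(x)]

# Continuous samples like those in the results table map to names like so:
print(pick_activation(5.51))   # 'elu'
print(pick_activation(0.202))  # 'relu'
```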
The resulting table is below:
| iter | target | activation | batch_size | epochs | learning_rate | neurons | optimizer |
|------|--------|------------|------------|--------|---------------|---------|-----------|
| 1    | 0.0    | 5.51       | 252.2      | 54.88  | 0.7716        | 36.58   | 1.044     |
| 2    | 0.0    | 0.202      | 478.2      | 39.09  | 0.3443        | 99.16   | 1.664     |
| 3    | 0.0    | 0.730      | 702.6      | 69.7   | 0.2815        | 51.96   | 0.8286    |
...
The code was adapted from https://www.analyticsvidhya.com/blog/2021/05/tuning-the-hyperparameters-and-layers-of-neural-network-deep-learning/ (author: Rendyk).
Solution 1:[1]
When you build the image, choose an image name and tag that you can remember later. Jenkins provides several environment variables that you can use to construct this, such as BUILD_NUMBER; in a multibranch pipeline you also have access to BRANCH_NAME and CHANGE_ID; or you can directly run git in your pipeline code.
def shortCommitID = sh script: 'git rev-parse --short HEAD', returnStdout: true
def dockerImage = "project:${shortCommitID.trim()}"
def registry = 'registry.example.com'
def fullDockerImage = "${registry}/${dockerImage}"
Now that you know the Docker image name you're going to use everywhere you can just use it; you never need to go off and look up the image ID. Using the scripted pipeline Docker integration, for example:
docker.withRegistry("https://${registry}") {
def image = docker.build(dockerImage)
    image.push()
}
Since you know the registry/image:tag name, you can simply use it in other Jenkins directives too:
def containerName = 'container'
// stop an old container, if any
sh "docker stop ${containerName} || true"
sh "docker rm ${containerName} || true"
// start a new container that outlives this pipeline
sh "docker run -d --name ${containerName} ${fullDockerImage}"
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | |
