How to use grid search to find the best learning rate in Python?
How can I use grid search to find the best learning rate for gradientDescent?
import numpy as np

def computeCost(X, y, theta, theta0):
    # Iterative computation of the mean squared error cost
    m = len(y)
    s = 0
    for i in range(0, m):
        s = s + ((theta0 + np.dot(theta, X[i])) - y[i])**2
    J = s / (2 * m)
    return J
def gradientDescent(X, y, theta, theta0, alpha, iterations):
    m = len(y)
    J_history = np.zeros((iterations, 1))
    for it in range(0, iterations):
        s1 = 0
        s0 = 0
        # Accumulate the gradients for the weights (s1) and intercept (s0)
        for i in range(0, m):
            s1 += ((theta0 + np.dot(theta, X[i])) - y[i]) * X[i]
            s0 += ((theta0 + np.dot(theta, X[i])) - y[i])
        # Update intercept and weights together using this iteration's gradients
        theta0 = theta0 - alpha * s0 / m
        theta = theta - alpha * s1 / m
        J_history[it] = computeCost(X, y, theta, theta0)
    return theta, J_history
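Before reaching for scikit-learn, note that a grid search over the learning rate is just a loop over candidate values of alpha, keeping the one with the lowest final cost. The sketch below uses a vectorized stand-in for the functions above and synthetic data (not the Pima dataset), so the names and numbers are illustrative only:

```python
import numpy as np

def gradient_descent(X, y, alpha, iterations):
    # Vectorized variant of the gradientDescent above, used only to
    # illustrate the grid-search loop; returns the final cost as well.
    m, n = X.shape
    theta = np.zeros(n)
    theta0 = 0.0
    for _ in range(iterations):
        err = X @ theta + theta0 - y
        theta0 -= alpha * err.mean()
        theta -= alpha * (X.T @ err) / m
    cost = np.sum((X @ theta + theta0 - y) ** 2) / (2 * m)
    return theta, theta0, cost

# Synthetic linear data as a stand-in for the real dataset
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 3.0

# Grid search: run gradient descent once per candidate alpha,
# then keep the alpha with the lowest final cost
alphas = [0.001, 0.01, 0.1, 0.3]
results = {a: gradient_descent(X, y, a, 500)[2] for a in alphas}
best_alpha = min(results, key=results.get)
print(best_alpha)
```

This is the whole idea behind grid search: evaluate every candidate, pick the best by some score. Libraries like scikit-learn add cross-validation on top so the chosen alpha is not tuned to one particular train set.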
Dataset: https://www.kaggle.com/uciml/pima-indians-diabetes-database
Solution 1:
You can use sklearn.model_selection.GridSearchCV() to tune the learning rate. Note that GridSearchCV expects an estimator object implementing fit() (with its hyperparameters exposed via get_params()/set_params()), not a bare function, so you need to wrap gradientDescent in a small estimator class, then define the parameter grid and a scoring function. See https://scikit-learn.org/stable/modules/generated/sklearn.model_selection.GridSearchCV.html for more information.
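A minimal sketch of such a wrapper is below. GDRegressor is an assumed name (it is not part of scikit-learn), and the data is synthetic rather than the Pima dataset; BaseEstimator supplies get_params()/set_params() automatically from the __init__ signature:

```python
import numpy as np
from sklearn.base import BaseEstimator, RegressorMixin
from sklearn.model_selection import GridSearchCV

class GDRegressor(BaseEstimator, RegressorMixin):
    # Hypothetical wrapper: gradient descent as a scikit-learn estimator,
    # so GridSearchCV can clone it and tune alpha.
    def __init__(self, alpha=0.01, iterations=500):
        self.alpha = alpha
        self.iterations = iterations

    def fit(self, X, y):
        m, n = X.shape
        self.theta_ = np.zeros(n)
        self.theta0_ = 0.0
        for _ in range(self.iterations):
            err = X @ self.theta_ + self.theta0_ - y
            self.theta0_ -= self.alpha * err.mean()
            self.theta_ -= self.alpha * (X.T @ err) / m
        return self

    def predict(self, X):
        return X @ self.theta_ + self.theta0_

# Synthetic stand-in data
rng = np.random.default_rng(1)
X = rng.normal(size=(120, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 1.0

# 3-fold cross-validated grid search over the learning rate
search = GridSearchCV(GDRegressor(), {"alpha": [0.001, 0.01, 0.1]},
                      scoring="neg_mean_squared_error", cv=3)
search.fit(X, y)
print(search.best_params_)
```

Because the wrapper follows the estimator API, the same object also works with other scikit-learn tools such as cross_val_score or RandomizedSearchCV.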
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | M.Rahnama |
