# Partial Derivative Term in the Gradient Descent Algorithm

I'm taking the "Machine Learning - Andrew Ng" course on Coursera. In the lesson called "Gradient Descent", I found the formula a bit complicated: it contains a "**partial derivative**" term.

My problem is understanding how the partial derivative term is calculated. In the lecture, the term works out to

`1/m * ∑ (h_θ(x) − y⁽ⁱ⁾)`

My question is: "How does the **1/2m** in the 'Cost Function' become **1/m** when calculating the partial derivative in the **Gradient Descent** update rule?"
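For reference, the two formulas from the lecture can be written side by side; differentiating the first with respect to θⱼ yields the second:

```latex
J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2

\frac{\partial}{\partial \theta_j} J(\theta)
  = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}
```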

## Solution 1

The differentiation of `x²` is `2x`.

Similarly, the differentiation of `∑ (h_θ(x) − y⁽ⁱ⁾)²` is `2 * ∑ (h_θ(x) − y⁽ⁱ⁾)` (times an inner-derivative factor from the chain rule, which doesn't affect the constant).

Therefore, the differentiation of `1/2m * ∑ (h_θ(x) − y⁽ⁱ⁾)²` is `1/m * ∑ (h_θ(x) − y⁽ⁱ⁾)`: the factor of 2 brought down by the power rule cancels the 2 in the `1/2m` denominator. This is exactly why the cost function carries the 1/2 in the first place — it makes the gradient cleaner.
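The cancellation can be checked numerically. Below is a minimal sketch (names and toy data are illustrative, assuming the linear hypothesis `h_θ(x) = θ₀ + θ₁x` from the course): it compares the analytic gradient `1/m * ∑ (h_θ(x) − y⁽ⁱ⁾) x_j⁽ⁱ⁾` — note the extra `x_j⁽ⁱ⁾` factor the full chain rule contributes — against finite differences of the `1/2m` cost.

```python
import numpy as np

def cost(theta, X, y):
    """J(theta) = 1/(2m) * sum((h_theta(x) - y)^2)."""
    m = len(y)
    residual = X @ theta - y
    return (residual @ residual) / (2 * m)

def gradient(theta, X, y):
    """Analytic gradient: 1/m * sum((h_theta(x) - y) * x_j)."""
    m = len(y)
    return X.T @ (X @ theta - y) / m

# Toy data: X has a column of ones for the intercept term theta_0.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(20), rng.normal(size=20)])
y = rng.normal(size=20)
theta = np.array([0.5, -1.0])

# Central finite differences approximate the same partial derivatives.
eps = 1e-6
numeric = np.array([
    (cost(theta + eps * e, X, y) - cost(theta - eps * e, X, y)) / (2 * eps)
    for e in np.eye(2)
])

# The analytic 1/m gradient matches the numerical derivative of the 1/2m cost.
print(np.allclose(gradient(theta, X, y), numeric, atol=1e-6))
```

If the 2 did not cancel (e.g. if you used `1/m` in the cost but kept `1/m` in the gradient), the two arrays would differ by a factor of 2 and the check would fail.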

## Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

| Solution | Source |
|---|---|
| Solution 1 | AHMED AGHADI |