Gradient Descent

(Algorithm)


What is Gradient descent?

Gradient descent is a first-order iterative optimization algorithm for finding a local minimum of a differentiable function. To find a local minimum, the algorithm takes repeated steps proportional to the negative of the gradient (or approximate gradient) of the function at the current point. Taking steps in the direction of the positive gradient instead approaches a local maximum; that procedure is known as gradient ascent. Gradient descent is generally attributed to Cauchy, who first suggested it in 1847; its convergence properties for non-linear optimization problems were first studied by Haskell Curry in 1944.
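The update rule described above can be sketched in a few lines of Python. The objective f(x) = (x − 3)², its gradient 2(x − 3), the learning rate, and the step count below are illustrative choices, not part of the algorithm's definition.

```python
def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    """Repeatedly step proportional to the negative gradient."""
    x = x0
    for _ in range(steps):
        x = x - learning_rate * grad(x)  # negative-gradient step
    return x

# Example objective: f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
grad_f = lambda x: 2 * (x - 3)
minimum = gradient_descent(grad_f, x0=0.0)  # converges toward x = 3
```

Flipping the sign of the step (adding `learning_rate * grad(x)` instead of subtracting it) yields gradient ascent, which climbs toward a local maximum.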

Technology Types

act, activity, algorithm, event, first order method, gradient method, know-how, mathematical optimization, method, optimization algorithms and method, procedure, rule

Synonyms

Gradient ascent, Gradient descent method, Gradient descent optimization, Steepest ascent, Steepest descent

Translations

Algorisme del gradient descendent (ca), Algorithme du gradient (fr), Discesa del gradiente (it), Gradientenverfahren (de), Metoda gradientu prostego (pl), Método do gradiente (pt), Penurunan gradien (in), Градиентный спуск (ru), Градієнтний спуск (uk), 경사 하강법 (ko), 最急降下法 (ja), 梯度下降法 (zh)

Tech Info


Date merged: 11/6/2021, 1:32:40 PM
Date scraped: 5/20/2021, 3:46:02 PM