  1. Gradient descent - Wikipedia

    It is a first-order iterative algorithm for minimizing a differentiable multivariate function. The idea is to take repeated steps in the opposite direction of the gradient (or approximate gradient) of the …

  2. Gradient Descent Algorithm in Machine Learning

    Jul 11, 2025 · Gradient Descent is used to iteratively update the weights (coefficients) and bias by computing the gradient of the MSE with respect to these parameters. Since MSE is a convex …
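
    The update described in this snippet can be sketched in plain Python. This is a minimal illustration, not code from the linked article; the function name `fit_linear`, the learning rate, and the step count are all illustrative assumptions.

```python
# Sketch: gradient descent on the MSE of a 1-D linear model y = w*x + b.
# Hypothetical example code; names and hyperparameters are assumptions.

def fit_linear(xs, ys, lr=0.05, steps=2000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of MSE = (1/n) * sum((w*x + b - y)^2) w.r.t. w and b.
        grad_w = (2 / n) * sum((w * x + b - y) * x for x, y in zip(xs, ys))
        grad_b = (2 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
        # Move both parameters opposite the gradient.
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

    Because the MSE of a linear model is convex, a small enough fixed learning rate is sufficient for convergence here.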

  3. The idea of gradient descent is then to move in the direction that minimizes the approximation of the objective above, that is, move a certain amount η > 0 in the direction −∇f(x) of steepest …

  4. strained optimization problem: min_{x ∈ ℝ^d} f(x). For most of today we'll also assume that f is differentiable everywhere. A classical method to solve such optimization problems is gradient …
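
    The steepest-descent step described in these two snippets, x ← x − η∇f(x), can be shown on a one-variable example. This is a hand-written sketch, not code from either source; the objective f(x) = (x − 3)², the step size η = 0.1, and the name `gradient_step` are illustrative assumptions.

```python
# One gradient-descent step x ← x − η∇f(x) for f(x) = (x − 3)^2.
# Hypothetical example; f, eta, and the iterate count are assumptions.

def gradient_step(x, eta=0.1):
    grad = 2 * (x - 3)  # exact derivative of f(x) = (x - 3)^2
    return x - eta * grad

x = 0.0
for _ in range(100):
    x = gradient_step(x)
# x is now very close to the minimizer x* = 3
```

    Each step shrinks the distance to the minimizer by a constant factor (here 1 − 2η = 0.8), which is why the iterate converges geometrically.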

  5. Gradient Descent in Machine Learning: A Deep Dive - DataCamp

    Sep 23, 2024 · Gradient descent is one of the most important algorithms in all of machine learning and deep learning. It is an extremely powerful optimization algorithm that can train linear …

  6. Gradient Descent Explained: How It Works & Why It’s Key

    Feb 28, 2025 · Gradient Descent is the core optimization algorithm for machine learning and deep learning models. Almost all modern AI architectures, including GPT-4, ResNet and AlphaGo, …

  7. Implementing Gradient Descent from Scratch: A Step-by-Step …

    May 18, 2025 · What is Gradient Descent? Gradient Descent is an iterative optimization algorithm used to minimize a function by moving in the direction of the steepest descent, as defined by …
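
    A from-scratch version of the idea in this snippet might look like the following. This is an assumed design, not the article's implementation: it approximates the gradient with central finite differences so only f itself has to be supplied, and the names `numeric_grad` and `gradient_descent` are invented for the example.

```python
# From-scratch gradient descent over a multivariate function, using a
# central finite-difference gradient. Hypothetical sketch; all names and
# hyperparameters are assumptions.

def numeric_grad(f, x, h=1e-6):
    # Central difference: df/dx_i ≈ (f(x + h*e_i) - f(x - h*e_i)) / (2h).
    grad = []
    for i in range(len(x)):
        xp, xm = x[:], x[:]
        xp[i] += h
        xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2 * h))
    return grad

def gradient_descent(f, x0, lr=0.1, steps=500):
    x = list(x0)
    for _ in range(steps):
        g = numeric_grad(f, x)
        # Step in the direction of steepest descent.
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x
```

    Swapping the finite-difference gradient for an analytic one is the usual next step, since numerical gradients cost an extra 2d function evaluations per iteration.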

  8. What is gradient descent? - IBM

    Gradient descent is an optimization algorithm that is commonly used to train machine learning models and neural networks. It trains machine learning models by minimizing errors between …

  9. Gradient Descent Unraveled - Towards Data Science

    Nov 14, 2020 · First, let us begin with the concepts of maxima, minima, global and local. I’ll explain these concepts for functions of a single variable because they are easy to visualize. …

  10. What is Gradient Descent - GeeksforGeeks

    Sep 29, 2025 · Gradient Descent is an iterative optimization algorithm used to minimize a cost function by adjusting model parameters in the direction of the steepest descent of the …