Gradient Descent and Convergence Analysis

IMPORTANT: This note was translated by an LLM and corrected manually. See the Chinese version for a more accurate note.

1. General Descent Methods

1.1 Basic Form of Descent Methods

Note: Descent methods do not require convexity, but convexity provides significant guarantees for solving optimization problems.

Gradient descent is an instance of the general descent method. A descent algorithm generates a sequence of iterates $x^{(k)}, k=1, \cdots$, where $$ x^{(k+1)} = x^{(k)} + t^{(k)}\Delta x^{(k)} $$ and $t^{(k)} > 0$ (unless $x^{(k)}$ is already optimal). ...
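The iteration above can be sketched in a few lines. This is a minimal illustration, not the note's reference implementation: it picks the negative gradient as the search direction $\Delta x^{(k)}$ and a fixed step size $t^{(k)}$, both of which are assumptions for demonstration (the general descent method leaves these choices open).

```python
import numpy as np

def descent_method(grad, x0, step_size=0.1, tol=1e-8, max_iter=1000):
    """Generic descent iteration: x^{(k+1)} = x^{(k)} + t^{(k)} * dx^{(k)}.

    Illustrative choices: dx^{(k)} = -grad(x) (gradient descent) and a
    constant step size t^{(k)} = step_size > 0.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        dx = -grad(x)               # descent direction
        if np.linalg.norm(dx) < tol:
            break                   # x^{(k)} is (near-)optimal, stop
        x = x + step_size * dx      # the update with t^{(k)} > 0
    return x

# Example: minimize f(x) = ||x||^2 with gradient 2x; the minimizer is 0.
x_star = descent_method(lambda x: 2 * x, x0=[3.0, -4.0])
```

With a fixed step size the iterates contract toward the minimizer geometrically for this quadratic; line-search rules for choosing $t^{(k)}$ come up in the convergence analysis.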

December 11, 2025 · 8 min