Subgradient

The subdifferential is a generalization of the gradient. For a function f:\mathbb{R}^n\rightarrow\mathbb{R}, the subdifferential at the point \bar x is given by

\partial f(\bar x)=\{g\in\mathbb{R}^n:f(x)-f(\bar x)\geq g^{T}(x-\bar x) for all x\in\mathbb{R}^n\}

The elements g\in\partial f(\bar x) are called subgradients.
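
As an added illustration (not part of the original entry), the defining inequality can be tested numerically on sample points. The Python sketch below, with a hypothetical helper violates and f(x)=|x| as in the example that follows, can certify that a candidate g is not a subgradient; since it checks only finitely many points, it can never prove membership.

    import numpy as np

    # f(x) = |x| in one dimension, as in the example below.
    f = lambda x: float(np.abs(x).sum())

    def violates(f, g, x_bar, samples, tol=1e-12):
        # True if some sample x breaks f(x) - f(x_bar) >= g^T (x - x_bar).
        return any(f(x) - f(x_bar) < float(g @ (x - x_bar)) - tol
                   for x in samples)

    x_bar = np.array([0.0])
    samples = [np.array([t]) for t in np.linspace(-2.0, 2.0, 401)]
    print(violates(f, np.array([0.5]), x_bar, samples))  # False: 0.5 lies in [-1, 1]
    print(violates(f, np.array([1.5]), x_bar, samples))  # True: 1.5 is too steep at 0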

Example

(Figure: subgradients of a convex function)

The subdifferential of the function f:\mathbb{R}\rightarrow\mathbb{R}:x\mapsto|x| is given by:

\partial f(\bar x)=\begin{cases}\{-1\} & \bar x<0\\ \left[-1,1\right] & \bar x=0\\ \{1\} & \bar x>0\end{cases}
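
To see where the interval at \bar x=0 comes from (a short check added here, not part of the original entry): g\in\partial f(0) means |x|\geq gx for all x\in\mathbb{R}. Testing x=1 gives g\leq1 and testing x=-1 gives g\geq-1; conversely, |g|\leq1 implies gx\leq|g|\,|x|\leq|x|. Hence \partial f(0)=[-1,1]. For \bar x\neq0 the function is differentiable, and the subdifferential collapses to the ordinary derivative \{\operatorname{sgn}(\bar x)\}.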


Boundedness

Let f:\mathbb{R}^n\rightarrow\mathbb{R} be continuous and let X\subset\mathbb{R}^n be bounded. Then the set \bigcup_{\bar x\in X}\partial f(\bar x) is bounded.

Proof

Let f:\mathbb{R}^n\rightarrow\mathbb{R} be continuous and let X\subset\mathbb{R}^n be bounded. Set \epsilon:=\sup |f(\overline{U_1(X)})|, where \overline{U_1(X)}=\{x\in\mathbb{R}^n\,|\,{\rm dist}(x,X)\leq1\}; this supremum is finite, since \overline{U_1(X)} is closed and bounded, hence compact, and f is continuous. Suppose \bigcup_{\bar x\in X}\partial f(\bar x) is not bounded. Then for R:=2\epsilon there exist an \bar x\in X and a g\in\partial f(\bar x) with \|g\|_2>R=2\epsilon. Set x:=\frac{1}{\|g\|_2} g+\bar x. Since \|x-\bar x\|_2=1, both \bar x and x lie in \overline{U_1(X)}. We obtain the estimate

g^T(x-\bar x)=\frac{1}{\|g\|_2}g^T g=\|g\|_2 > 2\epsilon\geq\left|f(x)-f(\bar x)\right|\geq f(x)-f(\bar x) .

Hence the subgradient inequality f(x)-f(\bar x)\geq g^T(x-\bar x) fails, so g is not a subgradient at \bar x. This is a contradiction.

\Box
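
A small numerical illustration of the statement (an added sketch, assuming f(x)=|x| and X=[-2,2], with the subdifferential taken from the case distinction in the example above): the union of all subdifferentials over X stays inside [-1,1], i.e. it is bounded.

    import numpy as np

    def subdifferential(x_bar):
        # Subdifferential of f(x) = |x|, from the case distinction above;
        # the interval [-1, 1] at 0 is represented by a finite sample.
        if x_bar < 0:
            return [-1.0]
        if x_bar > 0:
            return [1.0]
        return list(np.linspace(-1.0, 1.0, 21))

    X = np.linspace(-2.0, 2.0, 401)  # a bounded set, sampled
    union = [g for x in X for g in subdifferential(x)]
    print(max(abs(g) for g in union))  # 1.0: the union is contained in [-1, 1]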
