Consider the unconstrained minimization problem
$$\min_{\alpha} E(\alpha) \qquad\qquad (1)$$
where $\alpha \in \mathbb{R}^q$ and $E : \mathbb{R}^q \rightarrow \mathbb{R}$. We refer to $\alpha$ as the
design variable and to $E$ as the cost functional.
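To fix ideas, a minimal Python sketch of one possible cost functional follows; the quadratic form and the particular matrix `A` and vector `b` are illustrative assumptions, not anything prescribed by the text.

```python
import numpy as np

# Hypothetical quadratic cost functional E(alpha) = 0.5 alpha^T A alpha - b^T alpha,
# with A symmetric positive definite so the minimizer is unique (it solves A alpha = b).
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 0.0],
              [0.0, 0.0, 2.0]])
b = np.array([1.0, 0.0, -1.0])

def E(alpha):
    """Cost functional evaluated at the design variable alpha (a vector in R^q)."""
    return 0.5 * alpha @ (A @ alpha) - b @ alpha

def grad_E(alpha):
    """Gradient of E, known analytically in the quadratic case: A alpha - b."""
    return A @ alpha - b
```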
A change in the design variables by $\epsilon \tilde \alpha$
introduces a change $\delta E$
in the functional, which can be written as
$$\delta E \equiv
E(\alpha + \epsilon \tilde \alpha ) - E(\alpha) = \epsilon\, \tilde \alpha ^T \nabla E
+ \frac{1}{2}\, \epsilon ^2\, \tilde \alpha ^T {\cal H}\, \tilde \alpha + O(\epsilon ^3). \qquad\qquad (2)$$
Here $\nabla E = \left( \frac{\partial E}{\partial \alpha_1}, \dots, \frac{\partial E}{\partial \alpha_q} \right)^T$
is the gradient and ${\cal H}$ stands for the Hessian, i.e., the matrix
of second derivatives of $E$.
We assume the Hessian ${\cal H}$
is positive definite, i.e., $\tilde \alpha ^T {\cal H}\, \tilde \alpha > 0$
for all $\tilde \alpha \neq 0$, to guarantee a unique
minimum. For small $\epsilon$
we can neglect terms of second order and higher in $\epsilon$,
and substituting the choice $\tilde \alpha = - \nabla E$ into (2) shows that it
results in a reduction of the functional, that is,
$$E(\alpha - \epsilon \nabla E ) - E(\alpha) = - \epsilon \Vert \nabla E \Vert ^ 2 + O(\epsilon ^2). \qquad\qquad (3)$$
This is the basis for the steepest descent method and other
gradient-based methods.
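As an illustration only, a steepest descent iteration based on (3) might be sketched as follows; the step size `eps`, the tolerance, and the function `grad_E` (e.g., the one from the quadratic sketch above) are assumptions made here for the example.

```python
import numpy as np

def steepest_descent(grad_E, alpha0, eps=0.1, tol=1e-8, max_iter=10000):
    """Repeatedly step in the direction -grad E, the choice suggested by (3)."""
    alpha = np.asarray(alpha0, dtype=float)
    for _ in range(max_iter):
        g = grad_E(alpha)
        if np.linalg.norm(g) < tol:   # gradient (approximately) zero: stop
            break
        alpha = alpha - eps * g       # the update alpha <- alpha - eps * grad E
    return alpha
```

With the quadratic sketch above, for instance, `steepest_descent(grad_E, np.zeros(3))` approaches the solution of $A\alpha = b$, at which $\nabla E = 0$.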
The gradient $\nabla E$
of the functional to be
minimized can easily be computed in this case, say, by finite
differences (a sketch follows the optimality conditions below).
At a minimum the following equations hold:
$$\mbox{\tt Optimality Condition:} \qquad \frac{\partial E}{\partial \alpha _j} = 0, \qquad \qquad j=1, \dots, q. \qquad\qquad (4)$$
These equations are called the (first-order) necessary conditions for the problem.
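A sketch of how $\nabla E$ might be approximated by finite differences, together with a numerical check of the necessary conditions (4), is given below; the forward-difference formula, the step `h`, and the tolerance are illustrative choices rather than anything specified in the text.

```python
import numpy as np

def fd_gradient(E, alpha, h=1e-6):
    """Approximate each partial derivative dE/dalpha_j by a forward difference."""
    alpha = np.asarray(alpha, dtype=float)
    g = np.zeros_like(alpha)
    E0 = E(alpha)
    for j in range(alpha.size):
        step = np.zeros_like(alpha)
        step[j] = h
        g[j] = (E(alpha + step) - E0) / h
    return g

def first_order_conditions_hold(E, alpha, tol=1e-4):
    """Check the necessary conditions (4): all partials of E vanish, approximately."""
    return bool(np.all(np.abs(fd_gradient(E, alpha)) < tol))
```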
Shlomo Ta'asan
2001-08-22