Describes second-order conditions in local optimization used to determine whether an extremum is a maximum or a minimum.

When the objective function and constraints are twice differentiable, $f : \mathbb{R}^n \to \mathbb{R}$ and $g_i : \mathbb{R}^n \to \mathbb{R}$, it is easy to check whether an extremum $x_0$ is a maximum or a minimum of the functional. We appeal to second-order differentials, known as Hessians.

Definition 1 The $n \times n$ Hessian matrix $F(x)$ for the functional $f(x) : \mathbb{R}^n \to \mathbb{R}$ has entries

$$F(x)_{ij} = \frac{\partial^2 f(x)}{\partial x_i \partial x_j}, \quad i, j = 1, 2, \ldots, n.$$
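For intuition, the Hessian can be approximated numerically by central differences. The sketch below (a NumPy illustration, not part of the original text; the function `hessian` and the step size `eps` are choices made here) applies this to the functional $f(x) = x_2^2 + x_2 x_3 + x_1 x_3$ used in Example 1:

```python
import numpy as np

def hessian(f, x, eps=1e-5):
    """Approximate F(x)_ij = d^2 f / (dx_i dx_j) by central differences."""
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i, e_j = np.zeros(n), np.zeros(n)
            e_i[i], e_j[j] = eps, eps
            H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j)
                       - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4 * eps**2)
    return H

# Functional from Example 1: f(x) = x2^2 + x2*x3 + x1*x3 (0-indexed below)
f = lambda x: x[1]**2 + x[1]*x[2] + x[0]*x[2]
print(np.round(hessian(f, np.zeros(3)), 3))
```

Because $f$ is quadratic, the Hessian is constant and the finite-difference approximation recovers it at any evaluation point.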

Lemma 1 Let $L(x)$ be the Hessian of the Lagrangian $\mathcal{L}(x, \lambda)$ and let $x_0$ be an extremum. If $d^T L(x_0) d > 0$ for all $d \in \tilde{T}_\Omega(x_0)$, then $x_0$ is a minimizer. If $d^T L(x_0) d < 0$ for all $d \in \tilde{T}_\Omega(x_0)$, then $x_0$ is a maximizer.

Example 1 Find the extremum of $f(x) = x_2^2 + x_2 x_3 + x_1 x_3$ subject to $x_1 + x_2 + x_3 = 3$, and determine whether it is a maximum or a minimum.

To begin, we write the optimization's equality constraint:

$$g(x) = x_1 + x_2 + x_3 - 3.$$

The objective function can be written in the form

$$f(x) = \sum_{i,j} a_{ij} x_i x_j = x^T A x,$$

where the matrix $A$ has entries given by $A_{ij} = a_{ij}$. Thus, for our example the resulting matrix is

$$A = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 1 & 1 \\ 1 & 0 & 0 \end{bmatrix}.$$

The gradient for this function is given by

$$\nabla f(x) = (A + A^*) x = \begin{bmatrix} 0 & 0 & 1 \\ 0 & 2 & 1 \\ 1 & 1 & 0 \end{bmatrix} x = Bx,$$

where $B = A + A^*$. We can also rewrite the equality constraint as $g(x) = \mathbf{1}^T x - 3$, where $\mathbf{1}$ denotes a vector of appropriate size with all entries equal to one. Therefore, its gradient is $\nabla g(x) = \mathbf{1}$. The resulting gradient of the Lagrangian is set to zero to obtain the solution:

$$\nabla f(x) + \lambda \nabla g(x) = 0$$
$$Bx + \lambda \mathbf{1} = 0$$
$$Bx = -\lambda \mathbf{1}$$
$$x = -\lambda B^{-1} \mathbf{1} = \begin{bmatrix} -\lambda \\ 0 \\ -\lambda \end{bmatrix}$$

Solving for $\lambda$ from $\mathbf{1}^T x = 3$ yields $\lambda = -3/2$. Therefore, the optimization's solution is

$$x_0 = \begin{bmatrix} 3/2 \\ 0 \\ 3/2 \end{bmatrix}.$$
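The stationarity and feasibility equations above form a linear system in $(x, \lambda)$, so the solution can be cross-checked numerically. This is a verification sketch in NumPy (not part of the original text; the augmented-matrix construction is one standard way to pose the system):

```python
import numpy as np

# Quadratic objective f(x) = x^T A x with equality constraint 1^T x = 3.
A = np.array([[0, 0, 0],
              [0, 1, 1],
              [1, 0, 0]], dtype=float)
B = A + A.T                 # gradient of f is B x
ones = np.ones(3)

# Stationarity: B x + lambda * 1 = 0 (rows 1-3); feasibility: 1^T x = 3 (row 4).
K = np.zeros((4, 4))
K[:3, :3] = B
K[:3, 3] = ones
K[3, :3] = ones
rhs = np.array([0.0, 0.0, 0.0, 3.0])

sol = np.linalg.solve(K, rhs)
x0, lam = sol[:3], sol[3]
print(x0, lam)              # expect x0 = [1.5, 0, 1.5] and lambda = -1.5
```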

We can now compute the Hessians $F(x)$ of $f$ and $G(x)$ of $g$:

$$F(x) = (A + A^*) = \begin{bmatrix} 0 & 0 & 1 \\ 0 & 2 & 1 \\ 1 & 1 & 0 \end{bmatrix}, \quad G(x) = \begin{bmatrix} 0 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 0 \end{bmatrix}.$$

We therefore obtain that the Hessian of the Lagrangian is equal to

$$L(x_0) = F(x_0) + \lambda G(x_0) = \begin{bmatrix} 0 & 0 & 1 \\ 0 & 2 & 1 \\ 1 & 1 & 0 \end{bmatrix} = B.$$

At this point we need to check whether the product $h^T L(x_0) h$ is positive or negative for all $h \in \tilde{T}_\Omega(x_0)$, the tangent space defined as

$$\tilde{T}_\Omega(x_0) = \{h : \langle \nabla g(x_0), h \rangle = 0\} = \{h : \langle \mathbf{1}, h \rangle = 0\}.$$

It is easy to see that $h \in \tilde{T}_\Omega(x_0)$ if and only if $h_1 + h_2 + h_3 = 0$. To begin, we check whether the eigenvalues of $L(x_0)$ are all positive or all negative: a computation returns $\{-1.1701, 0.6889, 2.4812\}$. Since neither case occurs, we must specifically consider vectors satisfying $h_1 + h_2 + h_3 = 0$:
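The eigenvalue computation is easy to reproduce. A minimal NumPy sketch (not part of the original text; `eigvalsh` is used since $B$ is symmetric):

```python
import numpy as np

B = np.array([[0, 0, 1],
              [0, 2, 1],
              [1, 1, 0]], dtype=float)
eigvals = np.linalg.eigvalsh(B)   # ascending order; valid since B is symmetric
print(np.round(eigvals, 4))       # mixed signs: B is indefinite on all of R^3
```

Because the eigenvalues have mixed signs, the definiteness test on the full space is inconclusive, and we must restrict attention to the tangent space.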

$$h^T L(x_0) h = \sum_{i,j=1}^{3} B_{ij} h_i h_j = 2h_2^2 + 2h_2 h_3 + 2h_1 h_3 = 2\left(h_2^2 + h_3(h_1 + h_2)\right) = 2\left(h_2^2 - h_3^2\right),$$

where the last step uses $h_1 + h_2 = -h_3$.

It turns out that we can find $h \in \tilde{T}_\Omega(x_0)$ for which this quantity is positive and others for which it is negative: for example, $h = (-1, 1, 0)$ gives $h_2^2 - h_3^2 = 1 > 0$, while $h = (1, 0, -1)$ gives $h_2^2 - h_3^2 = -1 < 0$. Therefore, $x_0$ is neither a maximum nor a minimum; we have found a saddle point.
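The sign change on the tangent space can also be confirmed directly. A quick NumPy check (not part of the original text; the two test directions are choices made here, both with entries summing to zero):

```python
import numpy as np

B = np.array([[0, 0, 1],
              [0, 2, 1],
              [1, 1, 0]], dtype=float)

# Two directions in the tangent space (entries sum to zero).
h_pos = np.array([-1.0, 1.0, 0.0])   # h^T B h = 2*(h2^2 - h3^2) = 2 > 0
h_neg = np.array([1.0, 0.0, -1.0])   # h^T B h = 2*(0 - 1) = -2 < 0

print(h_pos @ B @ h_pos, h_neg @ B @ h_neg)
```

Since the quadratic form takes both signs on the tangent space, neither condition of Lemma 1 holds, confirming the saddle-point conclusion.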

Source:  OpenStax, Signal theory. OpenStax CNX. Oct 18, 2013 Download for free at http://legacy.cnx.org/content/col11542/1.3
