Perturbation function


In mathematical optimization, the perturbation function is any function that relates a primal problem to its dual problem. The name comes from the fact that any such function defines a perturbation of the initial problem. In many cases this takes the form of shifting the constraints.[1] In some texts the value function is called the perturbation function, and the perturbation function is called the bifunction.[2]

Definition

Let [math]\displaystyle{ \left(X,X^*\right) }[/math] and [math]\displaystyle{ \left(Y,Y^*\right) }[/math] be two dual pairs of separated locally convex spaces, and let [math]\displaystyle{ f: X \to \mathbb{R} \cup \{+\infty\} }[/math] be a given function. The primal problem is defined by

[math]\displaystyle{ \inf_{x \in X} f(x). \, }[/math]

If there are constraint conditions, these can be built into the function [math]\displaystyle{ f }[/math] by letting [math]\displaystyle{ f \leftarrow f + I_\mathrm{constraints} }[/math], where [math]\displaystyle{ I }[/math] is the indicator function (equal to 0 on the constraint set and [math]\displaystyle{ +\infty }[/math] elsewhere). Then [math]\displaystyle{ F: X \times Y \to \mathbb{R} \cup \{+\infty\} }[/math] is a perturbation function if and only if [math]\displaystyle{ F(x,0) = f(x) }[/math] for every [math]\displaystyle{ x \in X }[/math].[1][3]
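
As a simple illustration (an example added here, not taken from the cited references), consider minimizing [math]\displaystyle{ x^2 }[/math] over [math]\displaystyle{ x \geq 1 }[/math]. Folding the constraint into the objective and shifting it by a parameter [math]\displaystyle{ y }[/math] gives

[math]\displaystyle{ f(x) = x^2 + I_{[1,\infty)}(x), \qquad F(x,y) = x^2 + I_{[1-y,\infty)}(x), }[/math]

so that [math]\displaystyle{ F(x,0) = f(x) }[/math] and each value of [math]\displaystyle{ y }[/math] corresponds to the shifted constraint [math]\displaystyle{ x \geq 1 - y }[/math].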

Use in duality

The duality gap is the difference between the right-hand and left-hand sides of the inequality

[math]\displaystyle{ \sup_{y^* \in Y^*} -F^*(0,y^*) \le \inf_{x \in X} F(x,0), }[/math]

where [math]\displaystyle{ F^* }[/math] is the convex conjugate in both variables.[3][4]

For any choice of perturbation function F, weak duality holds. There are a number of conditions which, if satisfied, imply strong duality.[3] For instance, if F is proper, jointly convex, and lower semi-continuous with [math]\displaystyle{ 0 \in \operatorname{core}({\Pr}_Y(\operatorname{dom}F)) }[/math] (where [math]\displaystyle{ \operatorname{core} }[/math] is the algebraic interior and [math]\displaystyle{ {\Pr}_Y }[/math] is the projection onto Y defined by [math]\displaystyle{ {\Pr}_Y(x,y) = y }[/math]), and X and Y are Fréchet spaces, then strong duality holds.[1]
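
These relations can be checked numerically on a small example. The following Python sketch (an illustration added here, not taken from the cited references) uses the one-dimensional problem of minimizing [math]\displaystyle{ x^2 }[/math] subject to [math]\displaystyle{ x \geq 1 }[/math] with perturbation function [math]\displaystyle{ F(x,y) = x^2 + I_{[1-y,\infty)}(x) }[/math]; the closed-form dual objective [math]\displaystyle{ -F^*(0,y^*) = -y^* - (y^*)^2/4 }[/math] for [math]\displaystyle{ y^* \leq 0 }[/math] is worked out by hand for this particular problem.

    import numpy as np

    # Toy problem: minimize f(x) = x**2 subject to x >= 1.
    # Perturbation: F(x, y) = x**2 if x >= 1 - y, else +infinity, so F(x, 0) = f(x).

    def primal_value():
        # inf_{x >= 1} x**2, approximated on a grid
        xs = np.linspace(1.0, 5.0, 10001)
        return np.min(xs ** 2)

    def dual_value():
        # sup_{y* <= 0} -F*(0, y*), with -F*(0, y*) = -y* - y***2 / 4 (hand-derived above)
        ys = np.linspace(-10.0, 0.0, 10001)
        return np.max(-ys - ys ** 2 / 4.0)

    p, d = primal_value(), dual_value()
    print(f"primal = {p:.4f}, dual = {d:.4f}, gap = {p - d:.4f}")
    # Weak duality guarantees dual <= primal; here both are (approximately) 1, so the gap is 0.

For this example the regularity condition of the previous paragraph is satisfied, so the printed gap is zero, consistent with strong duality.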

Examples

Lagrangian

Let [math]\displaystyle{ (X,X^*) }[/math] and [math]\displaystyle{ (Y,Y^*) }[/math] be dual pairs. Given a primal problem (minimize f(x)) and a related perturbation function F(x,y), the Lagrangian [math]\displaystyle{ L: X \times Y^* \to \mathbb{R} \cup \{\pm\infty\} }[/math] is the negative conjugate of F with respect to y (i.e. the concave conjugate). That is, the Lagrangian is defined by

[math]\displaystyle{ L(x,y^*) = \inf_{y \in Y} \left\{F(x,y) - y^*(y)\right\}. }[/math]

In particular, the weak duality minimax inequality can be shown to be

[math]\displaystyle{ \sup_{y^* \in Y^*} -F^*(0,y^*) = \sup_{y^* \in Y^*} \inf_{x \in X} L(x,y^*) \leq \inf_{x \in X} \sup_{y^* \in Y^*} L(x,y^*) = \inf_{x \in X} F(x,0). }[/math]
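
The two outer equalities can be verified directly from the definition of L; the second additionally uses the biconjugation (Fenchel–Moreau) theorem and therefore assumes that [math]\displaystyle{ F(x,\cdot) }[/math] is proper, convex and lower semi-continuous (a verification added here for completeness):

[math]\displaystyle{ \inf_{x \in X} L(x,y^*) = \inf_{x \in X} \inf_{y \in Y} \left\{F(x,y) - y^*(y)\right\} = -\sup_{x \in X,\, y \in Y} \left\{y^*(y) - F(x,y)\right\} = -F^*(0,y^*), }[/math]

and

[math]\displaystyle{ \sup_{y^* \in Y^*} L(x,y^*) = \sup_{y^* \in Y^*} \left\{-\sup_{y \in Y}\left\{y^*(y) - F(x,y)\right\}\right\} = \left(F(x,\cdot)\right)^{**}(0) = F(x,0). }[/math]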

Suppose the primal problem is given by

[math]\displaystyle{ \inf_{x: g(x) \leq 0} f(x) = \inf_{x \in X} \tilde{f}(x) }[/math]

where [math]\displaystyle{ \tilde{f}(x) = f(x) + I_{\mathbb{R}^d_+}(-g(x)) }[/math]. If the perturbed problem is given by

[math]\displaystyle{ \inf_{x: g(x) \leq y} f(x) }[/math]

then the perturbation function is

[math]\displaystyle{ F(x,y) = f(x) + I_{\mathbb{R}^d_+}(y - g(x)). }[/math]

Thus the connection to Lagrangian duality is apparent, as L reduces to

[math]\displaystyle{ L(x,y^*) = \begin{cases} f(x) - y^*(g(x)) & \text{if } y^* \in \mathbb{R}^d_-, \\ -\infty & \text{else}. \end{cases} }[/math]
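
Writing [math]\displaystyle{ \lambda = -y^* \in \mathbb{R}^d_+ }[/math] (a change of sign convention made here for illustration, not used in the cited references), the dual problem [math]\displaystyle{ \sup_{y^* \in Y^*} \inf_{x \in X} L(x,y^*) }[/math] takes the familiar form of the classical Lagrangian dual:

[math]\displaystyle{ \sup_{y^* \in \mathbb{R}^d_-} \inf_{x \in X} \left\{f(x) - y^*(g(x))\right\} = \sup_{\lambda \in \mathbb{R}^d_+} \inf_{x \in X} \left\{f(x) + \lambda^\mathsf{T} g(x)\right\}. }[/math]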

Fenchel duality

Let [math]\displaystyle{ (X,X^*) }[/math] and [math]\displaystyle{ (Y,Y^*) }[/math] be dual pairs. Assume there exists a linear map [math]\displaystyle{ T: X \to Y }[/math] with adjoint operator [math]\displaystyle{ T^*: Y^* \to X^* }[/math]. Assume the primal objective function [math]\displaystyle{ f(x) }[/math] (including the constraints by way of the indicator function) can be written as [math]\displaystyle{ f(x) = J(x,Tx) }[/math] for some function [math]\displaystyle{ J: X \times Y \to \mathbb{R} \cup \{+\infty\} }[/math]. Then the perturbation function is given by

[math]\displaystyle{ F(x,y) = J(x,Tx - y). }[/math]

In particular, if the primal objective is [math]\displaystyle{ f(x) + g(Tx) }[/math], then the perturbation function is given by [math]\displaystyle{ F(x,y) = f(x) + g(Tx - y) }[/math], which is the traditional setting of Fenchel duality.[5]
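
In this setting, the dual objective [math]\displaystyle{ -F^*(0,y^*) }[/math] can be computed explicitly (a short calculation added here for illustration; the only step is the substitution [math]\displaystyle{ z = Tx - y }[/math]):

[math]\displaystyle{ F^*(0,y^*) = \sup_{x \in X,\, y \in Y} \left\{y^*(y) - f(x) - g(Tx - y)\right\} = \sup_{x \in X} \left\{(T^*y^*)(x) - f(x)\right\} + \sup_{z \in Y} \left\{-y^*(z) - g(z)\right\} = f^*(T^*y^*) + g^*(-y^*), }[/math]

so the dual problem [math]\displaystyle{ \sup_{y^* \in Y^*} -F^*(0,y^*) }[/math] becomes

[math]\displaystyle{ \sup_{y^* \in Y^*} \left\{-f^*(T^*y^*) - g^*(-y^*)\right\}, }[/math]

the familiar Fenchel–Rockafellar dual problem.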

References

  1. Radu Ioan Boţ; Gert Wanka; Sorin-Mihai Grad (2009). Duality in Vector Optimization. Springer. ISBN 978-3-642-02885-4.
  2. J. P. Ponstein (2004). Approaches to the Theory of Optimization. Cambridge University Press. ISBN 978-0-521-60491-8.
  3. Zălinescu, C. (2002). Convex Analysis in General Vector Spaces. River Edge, NJ: World Scientific Publishing Co., Inc. pp. 106–113. ISBN 981-238-067-1.
  4. Ernö Robert Csetnek (2010). Overcoming the failure of the classical generalized interior-point regularity conditions in convex optimization. Applications of the duality theory to enlargements of maximal monotone operators. Logos Verlag Berlin GmbH. ISBN 978-3-8325-2503-3. 
  5. Radu Ioan Boţ (2010). Conjugate Duality in Convex Optimization. Springer. p. 68. ISBN 978-3-642-04899-9.