Lambda optimization


Hi everyone!

This time I'd like to write about what's widely known as the "Aliens trick" (it got popularized after the 2016 IOI problem called "Aliens"). There are already some articles about it here and there, and I'd like to summarize them.


Lagrange multipliers (continuous case)

Let $$$f : \mathbb R^n \to \mathbb R$$$ be the objective function and $$$g : \mathbb R^n \to \mathbb R^c$$$ be the constraint function. The constrained optimization problem

$$$\begin{gather}f(x) \to \min\\ g(x)=0\end{gather}$$$

in some cases can be reduced to finding stationary points of the Lagrange function

$$$L(x, \lambda) = f(x) - \lambda \cdot g(x).$$$

Here $$$\lambda \cdot g(x)$$$ is the dot product of a variable vector $$$\lambda$$$, called the Lagrange multiplier, and $$$g(x)$$$. If $$$f$$$ and $$$g$$$ are continuously differentiable, then for any solution $$$x^*$$$ of the initial problem at which the gradients $$$\nabla g_1(x^*), \dots, \nabla g_c(x^*)$$$ are linearly independent, there exists a $$$\lambda^*$$$ such that $$$(x^*, \lambda^*)$$$ is a stationary point of $$$L(x, \lambda)$$$.

Geometrically, the idea behind Lagrange multipliers is that at the extreme point $$$x^*$$$, the tangent space to the constraint surface $$$g(x)=0$$$ should lie completely within the tangent space of the contour surface $$$f(x)=f(x^*)$$$. See the illustration below for the 2D case.

For linear subspaces, $$$A \subset B$$$ if and only if $$$B^\bot \subset A^\bot$$$, where $$$A^\bot$$$ is the orthogonal complement of $$$A$$$. For a tangent space, the orthogonal complement is the corresponding normal space. For the surface $$$f(x)=f(x^*)$$$, the normal space is the line spanned by the gradient vector $$$\nabla f(x)$$$, and for the surface $$$g(x)=0$$$, the normal space is the linear span of the vectors $$$\nabla g_1(x), \dots, \nabla g_c(x)$$$.

Thus, for the point $$$x^*$$$ to be a solution of the constrained problem, $$$\nabla f(x^*)$$$ must be a linear combination of $$$\nabla g_1(x^*), \dots, \nabla g_c(x^*)$$$. This, in turn, is exactly the stationary point condition of $$$L(x, \lambda)$$$, with the vector $$$\lambda$$$ consisting of the coefficients of this linear combination.
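
To see this machinery on a tiny instance (this toy example is mine, not from the original text): minimize $$$f(x, y) = x^2 + y^2$$$ subject to $$$g(x, y) = x + y - 1 = 0$$$. The Lagrange function is

$$$L(x, y, \lambda) = x^2 + y^2 - \lambda (x + y - 1).$$$

Setting its partial derivatives to zero gives $$$2x = \lambda$$$, $$$2y = \lambda$$$ and $$$x + y = 1$$$, hence $$$x^* = y^* = \frac{1}{2}$$$ and $$$\lambda^* = 1$$$. Indeed, $$$\nabla f(x^*) = (1, 1) = 1 \cdot \nabla g(x^*)$$$, so the objective gradient is a linear combination (here, a single multiple) of the constraint gradient, exactly as described above.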


Example problem

You're given an array $$$a_1, \dots, a_n$$$. You have to choose $$$k$$$ non-intersecting non-empty contiguous subsegments of it such that the sum of

$$$(a_l + \dots + a_r)$$$

over all chosen segments $$$[l, r]$$$ is maximized.

Let's formally state it as a constrained optimization problem. Let $$$X$$$ be the set of all collections of non-intersecting contiguous segments of $$$1..n$$$, let $$$f(x)$$$ be the total sum of the elements covered by a collection $$$x \in X$$$, and let $$$g(x)$$$ be the number of segments in $$$x$$$ minus $$$k$$$, so that we maximize $$$f(x)$$$ subject to $$$g(x) = 0$$$. Further, let $$$d(i, j)$$$ be the maximum total sum achievable with exactly $$$j$$$ segments inside the prefix $$$1..i$$$; the answer we want is $$$d(n, k)$$$. Dropping the constraint and instead penalizing each segment by $$$\lambda$$$ leads to the function

$$$L(i, \lambda) = L_\lambda(i) = \max_j (d(i, j) - \lambda \cdot j).$$$

This function is nicer to work with than $$$d(i, j)$$$ itself: once $$$\lambda$$$ is fixed, $$$L_\lambda(r)$$$ can be computed for all $$$r$$$ without tracking the number of segments, via the recurrence

$$$L_\lambda(r) = \max \left\{ L_\lambda(r-1),~ \max\limits_{l=1..r} \left[L_\lambda(l-1) + (a_l + \dots + a_r)\right] - \lambda \right\}.$$$

If we introduce the prefix sums $$$p_m = a_1 + \dots + a_m$$$ (with $$$p_0 = 0$$$), it can be rewritten as

$$$L_\lambda(r) = \max \left\{ L_\lambda(r-1),~ \max\limits_{l=1..r} \left[L_\lambda(l-1)- p_{l-1}\right] + p_r - \lambda\right\}.$$$
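
To make the recurrence concrete, here is a minimal C++ sketch of the computation for a fixed $$$\lambda$$$ (my own illustration, not code from the original post). It evaluates $$$L_\lambda(n)$$$ in $$$O(n)$$$ and also counts how many segments the optimum uses; the function name and the tie-breaking rule (prefer fewer segments when values are equal) are assumptions made for this sketch.

```cpp
#include <bits/stdc++.h>
using namespace std;

// For a fixed integer penalty lambda, compute L_lambda(n) via the
// prefix-sum recurrence above, together with the number of segments
// used by the optimum (the quantity the trick will later monitor).
pair<long long, int> solve_fixed_lambda(const vector<long long>& a, long long lambda) {
    int n = a.size();
    vector<long long> p(n + 1, 0);               // p[m] = a_1 + ... + a_m, p[0] = 0
    for (int i = 0; i < n; i++) p[i + 1] = p[i] + a[i];

    vector<long long> L(n + 1, 0);               // L[r] = L_lambda(r)
    vector<int> cnt(n + 1, 0);                   // segments in the optimum for prefix r
    long long best = LLONG_MIN / 2;              // max over l <= r of L[l-1] - p[l-1]
    int bestCnt = 0;
    for (int r = 1; r <= n; r++) {
        if (L[r - 1] - p[r - 1] > best) {        // a new segment may start at position r
            best = L[r - 1] - p[r - 1];
            bestCnt = cnt[r - 1];
        }
        long long take = best + p[r] - lambda;   // close a segment at position r
        if (take > L[r - 1]) {                   // strict '>' prefers fewer segments on ties
            L[r] = take;
            cnt[r] = bestCnt + 1;
        } else {                                 // leave position r uncovered
            L[r] = L[r - 1];
            cnt[r] = cnt[r - 1];
        }
    }
    return {L[n], cnt[n]};
}
```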

Thus, for this problem $$$L_\lambda(r)$$$ can be computed in $$$O(n)$$$ for all $$$r$$$. Note that $$$\lambda$$$ influences how many segments are actually taken in the optimal answer: with a bigger $$$\lambda$$$ it is beneficial to take a smaller number of segments, and with a smaller $$$\lambda$$$ it is better to take a bigger number of them. What would be the actual meaning of $$$\lambda$$$ here?


Lagrange multipliers (discrete case)

In the previous section we had an assumption that $$$f(x)$$$ and $$$g(x)$$$ are continuously differentiable. How can we make sense of this when the domain is discrete and no derivatives are available?

