chromate00's blog

By chromate00, 15 months ago, In English

Before we begin: This blog is my attempt at the Codeforces Month of Blog Posts. Simple rule: write an interesting Codeforces blog post by February 15th and win $300.

In this blog, I will be describing the concept of Convex Optimization, and techniques that can be used to solve Convex Optimization tasks. The blog will be divided into two parts — "What is Convex Optimization", and "How do we solve Convex Optimization tasks". This part of the blog is for "What is Convex Optimization". (Part 2 is here)


What is Convex Optimization?

"A convex optimization problem is an optimization problem in which the objective function is a convex function and the feasible set is a convex set." as Wikipedia says. This may be hard to understand for many people, so let's use the simpler case of a univariate function. Often the "convex function" is called "unimodal function" in CP, and the tasks that use Ternary Search is very often a Convex Optimization task. Let's understand why using ternary search is possible in an univariate convex optimization task.

In the definition above, it is said that "the feasible set is a convex set" in convex optimization. The feasible set is the set of points where the solution can exist. A convex set in one dimension is basically equivalent to an interval. For Ternary Search we define $$$L$$$ and $$$R$$$, the limits where the solution can exist. Here, the feasible set is $$$[L,R]$$$, and of course this set is an interval, so the feasible set is a convex set.

Also citing from the definition of "unimodality" on Wikipedia, "a function f(x) is a unimodal function if for some value m, it is monotonically increasing for x ≤ m and monotonically decreasing for x ≥ m". The same holds for univariate convex functions with the directions flipped: a convex function is monotonically decreasing for x ≤ m and monotonically increasing for x ≥ m, where m is its minimum point. So convex functions are unimodal (with a minimum instead of a maximum), and we can solve univariate convex optimization tasks with ternary search.
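As a concrete illustration, here is a minimal ternary search sketch in C++ that minimizes a univariate convex function over an interval $$$[L,R]$$$. The particular objective `f` below is just a placeholder I made up; plug in the actual convex function of your problem, and adjust the bounds and iteration count to its limits.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Placeholder convex objective; replace with the problem's actual function.
double f(double x) { return (x - 3.0) * (x - 3.0) + 1.0; }

int main() {
    double lo = -1e9, hi = 1e9;          // feasible set: the interval [lo, hi]
    for (int it = 0; it < 200; it++) {   // each iteration shrinks the interval by a third
        double m1 = lo + (hi - lo) / 3.0;
        double m2 = hi - (hi - lo) / 3.0;
        if (f(m1) < f(m2)) hi = m2;      // the minimum cannot lie in (m2, hi]
        else               lo = m1;      // the minimum cannot lie in [lo, m1)
    }
    printf("argmin ~= %.6f, min value ~= %.6f\n", lo, f(lo));
}
```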

Just as in one dimension, convex functions and convex sets are well defined in multiple dimensions, so we can extend this idea to higher dimensions as well. In this blog I will not write things in standard form; instead I will explain everything verbally and with simple equations, because the standard form can be difficult for many people to follow (I don't fully understand it either).


Important aspects of Convex Functions

There are some important properties of convex functions that can help in proving convexity in other tasks. I will use some of these properties to prove the convexity of some convex optimization tasks later on.

  • The pointwise maximum of two or more convex functions is a convex function.

This can be understood intuitively: the epigraph (the region on and above the graph) of the maximum is the intersection of the epigraphs of the individual functions, and the intersection of convex sets is convex, just like the intersection of two convex polygons is a convex polygon and the intersection of two intervals (convex sets in one dimension) is an interval. Do note, though, that the pointwise minimum of two convex functions is not necessarily convex, just as the union of two convex sets is not necessarily convex.

  • The sum of two or more convex functions is a convex function.

This can be proven mathematically, but I will not prove it in detail here: adding the convexity inequalities of the individual functions immediately gives the convexity inequality for their sum. There are plenty of full proofs online if you want one.

  • If a function $$$f(x)$$$ is differentiable, then any point where $$$f'(x)=0$$$ is a local maximum, a local minimum, or a saddle point; if $$$f(x)$$$ is additionally convex, such a point is a global minimum.

This is just basic calculus, but it serves as the basis of Newton's method in optimization. Not many convex optimization tasks have twice differentiable objective functions (as Newton's method in optimization requires), but it is worth knowing that this exists; see the sketch after this list.

  • The distance to a fixed point is a convex function, and so is its square.

This is easy enough to understand, and it is the most basic convex function in geometry tasks that require convex optimization.

  • For a convex function, any local minimum is also a global minimum (and, symmetrically, for a concave function, any local maximum is a global maximum).

This is the essence of why convex optimization tasks can be solved more easily than general optimization tasks. I won't prove it formally, but thinking about why it holds is good practice.
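Referring back to the derivative property above, here is a minimal C++ sketch of Newton's method for one-dimensional minimization, assuming a twice differentiable convex objective. The functions `g`, `dg`, and `d2g` below are placeholders I chose for illustration; they are not from any particular problem.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Placeholder twice-differentiable convex objective and its derivatives.
double g(double x)   { return x * x - 4.0 * x + 7.0; }
double dg(double x)  { return 2.0 * x - 4.0; }   // g'(x)
double d2g(double)   { return 2.0; }             // g''(x), constant here

int main() {
    double x = 0.0;                              // starting guess
    for (int it = 0; it < 100; it++) {
        // Newton step: jump to the minimum of the local quadratic model.
        x -= dg(x) / d2g(x);
    }
    printf("argmin ~= %.6f, min value ~= %.6f\n", x, g(x));
}
```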


Some famous convex optimization tasks, and their proof of convexity

Here are some famous convex optimization tasks, as examples to help understand convex optimization.

  • Smallest Enclosing Circle

This is a convex optimization task asking for the smallest circle enclosing a set of points. (Sometimes the problem is stated for a set of circles, but here we only discuss the case of a set of points.) At first glance it may seem that we need to determine both the center and the radius, but in fact we only need to determine the center; the radius is then simply the distance from the center to the farthest point. So, the objective function for a set of points $$$S$$$ is as follows.

$$$\displaystyle f(x,y)=\max _{i \in S} \sqrt{(x-x_i)^2+(y-y_i)^2}$$$

The distance to each point is a convex function, and their pointwise maximum is also a convex function. The feasible set can be taken to be the set's convex hull (the optimal center always lies inside it), which is a convex set. Therefore, the Smallest Enclosing Circle is a convex optimization task. An $$$O(n)$$$ algorithm is known for this task (Megiddo's deterministic algorithm, or Welzl's expected-linear algorithm), but it can still be solved fast enough as a convex optimization task.
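As a quick preview of part 2, here is a minimal C++ sketch that minimizes this objective with a nested ternary search: the outer search runs over $$$x$$$ and the inner search over $$$y$$$ (for a fixed $$$x$$$, $$$f(x,y)$$$ is still convex in $$$y$$$, and the inner minimum is convex in $$$x$$$). The coordinate bounds and the sample points are assumptions for illustration; adjust them to the problem's limits.

```cpp
#include <bits/stdc++.h>
using namespace std;

vector<pair<double,double>> pts;             // the point set S

// Objective: radius needed if the center is placed at (x, y).
double f(double x, double y) {
    double best = 0;
    for (auto [px, py] : pts)
        best = max(best, hypot(x - px, y - py));
    return best;
}

// For a fixed x, minimize over y by ternary search.
double best_for_x(double x) {
    double lo = -1e6, hi = 1e6;              // assumed coordinate bounds
    for (int it = 0; it < 100; it++) {
        double m1 = lo + (hi - lo) / 3, m2 = hi - (hi - lo) / 3;
        if (f(x, m1) < f(x, m2)) hi = m2; else lo = m1;
    }
    return f(x, lo);
}

int main() {
    pts = {{0, 0}, {4, 0}, {0, 3}};          // sample input
    double lo = -1e6, hi = 1e6;              // outer ternary search over x
    for (int it = 0; it < 100; it++) {
        double m1 = lo + (hi - lo) / 3, m2 = hi - (hi - lo) / 3;
        if (best_for_x(m1) < best_for_x(m2)) hi = m2; else lo = m1;
    }
    printf("smallest enclosing radius ~= %.6f\n", best_for_x(lo));
}
```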

  • Geometric Median

This is a convex optimization task asking for a point that minimizes the sum of distances to all points in a set. The objective function for this task is simple (as simple as the task's definition).

$$$\displaystyle f(x,y)=\sum _{i \in S} \sqrt{(x-x_i)^2+(y-y_i)^2}$$$

The distance to each point is a convex function, and their sum is also a convex function. The feasible set is, again, the set's convex hull. Therefore, the Geometric Median problem is a convex optimization task.
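The same nested ternary search idea from the Smallest Enclosing Circle sketch works here as well; only the objective changes from a maximum of distances to a sum. A self-contained sketch (again with made-up bounds and sample points) might look like this:

```cpp
#include <bits/stdc++.h>
using namespace std;

vector<pair<double,double>> pts;             // the point set S

// Objective: total distance from a candidate point (x, y) to all points.
double f(double x, double y) {
    double total = 0;
    for (auto [px, py] : pts)
        total += hypot(x - px, y - py);
    return total;
}

// For a fixed x, minimize over y by ternary search.
double best_for_x(double x) {
    double lo = -1e6, hi = 1e6;              // assumed coordinate bounds
    for (int it = 0; it < 100; it++) {
        double m1 = lo + (hi - lo) / 3, m2 = hi - (hi - lo) / 3;
        if (f(x, m1) < f(x, m2)) hi = m2; else lo = m1;
    }
    return f(x, lo);
}

int main() {
    pts = {{0, 0}, {4, 0}, {0, 3}};          // sample input
    double lo = -1e6, hi = 1e6;              // outer ternary search over x
    for (int it = 0; it < 100; it++) {
        double m1 = lo + (hi - lo) / 3, m2 = hi - (hi - lo) / 3;
        if (best_for_x(m1) < best_for_x(m2)) hi = m2; else lo = m1;
    }
    printf("minimum total distance ~= %.6f\n", best_for_x(lo));
}
```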


The article will continue with methods to solve convex optimization tasks in the next part. Stay tuned!
