### dragonslayerintraining's blog

By dragonslayerintraining, history, 6 months ago,

I finally figured out a nice way to implement dynamic convex hull with a reasonable constant factor! I expect this will be "well known in China" though.

Consider the problem of finding nested convex hulls in $O(n\log^2{n})$. The simplest way I know of is to make a convex hull data structure that supports point deletions, which is what I do here. It should be possible to extend this implementation to handle insertions as well.

This implementation is based on this paper by Overmars and van Leeuwen that describes a data structure for online fully dynamic convex hull in $O(\log^2{n})$ per operation. Directly implementing that is complicated and has a bad constant factor, so we'll do it a bit differently using ideas from FizzyDavid's solution to 1270H.

# The data structure

We need a data structure for points in the plane that can support

• deleting a point in $O(\log^2{n})$

• returning the convex hull, consisting of $k$ points, in $O(k\log^2{n})$ time

The left and right hulls are handled separately. From now on, we'll only consider the left hull.

We can build the left hull using a divide and conquer strategy:

1. Split the points in half by y-coordinate.
2. Recursively build the hull for both halves.
3. Find a "bridge" that connects the two hulls. This can be done in $O(\log{n})$ with a binary search.
4. Merge the two hulls with the bridge. Edges covered by the bridge are discarded.
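
As a toy illustration of steps 1-4 (not the real data structure), here is a static divide-and-conquer left-hull construction where the bridge is found by brute force instead of the $O(\log{n})$ binary search; the function names and the quadratic merge are my own simplification.

```python
# Toy sketch of the divide-and-conquer left-hull construction.
# The bridge is found by brute force here instead of binary search,
# purely to illustrate steps 1-4.

def cross(o, a, b):
    """Cross product of vectors o->a and o->b."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def left_hull(pts):
    """Left hull (bottom to top) of points sorted by y, distinct y's."""
    if len(pts) <= 1:
        return list(pts)
    mid = len(pts) // 2
    lo = left_hull(pts[:mid])          # steps 1-2: split by y, recurse
    hi = left_hull(pts[mid:])
    # step 3: the bridge a->b has every point of both hulls on its right
    for i, a in enumerate(lo):
        for j, b in enumerate(hi):
            if all(cross(a, b, p) <= 0 for p in lo + hi):
                # step 4: edges covered by the bridge are discarded
                return lo[:i + 1] + hi[j:]
```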

To make this dynamic, we can turn this divide and conquer into a segment tree.

The paper explicitly stores the convex hull in a balanced binary search tree that supports split and concatenate in $O(\log^2{n})$. We'll instead implicitly represent the convex hull of each node in a segment tree.

Each node of the segment tree must store

• Pointers to its left and right children

• The bridge between the convex hulls of its children

• The bottommost point of its leaves

In a leaf node, we just pretend the bridge is between the point and itself.

Interestingly, computing the bridge can be done by querying its children in $O(\log{n})$.

# Finding the bridge efficiently

A node of the segment tree must be able to compute the bridge of its children's convex hulls efficiently.

The paper has $10$ cases, but I think we only need $6$.

We have two segment tree nodes $p$ and $q$ that we want to find the bridge between. Let $a$ and $b$ be the points of the bridge of the first node and $c$ and $d$ be the points of the bridge of the second node.

Repeat the following until both $p$ and $q$ are leaves.

• If $a\ne b$ ($p$ is not a leaf) and $a,b,c$ is in counterclockwise order, we can discard $b$ and all points above it. Set $p$ to its first child.

• If $c\ne d$ and $b,c,d$ is in counterclockwise order, we can discard $c$ and all points below it. Set $q$ to its second child.

• Otherwise, if $a=b$, we can discard $d$ and all points above it. Set $q$ to its first child.

• Otherwise, if $c=d$, we can discard $a$ and all points below it. Set $p$ to its second child.

• If we get here, $a$, $b$, $c$, $d$ are in clockwise order. The horizontal line through the bottommost point of $q$ separates the two hulls. Depending on which side of this line the intersection of lines $ab$ and $cd$ lies, we can either discard $a$ and all points below it, or $d$ and all points above it. We either set $p$ to its second child or set $q$ to its first child.

After every iteration, we move either $p$ or $q$ down to one of its children, so this process terminates in $O(\log{n})$ iterations.

# Extracting the convex hull

To compute the convex hull, we define a recursive function that does the following:

• Given a node and two points $l$ and $r$ on the convex hull of the node, output the points on the convex hull between them, inclusive.

We just need to handle a few cases.

• If the bridge is contained between $l$ and $r$, we recurse on both children.
• If the bridge is fully covered, we recurse on the child that isn't fully covered.

It is impossible for the bridge to be partially covered.

# Deletions

Deletions are kind of annoying because there is nothing that naturally acts as identity. I just removed all nodes with one child from the tree to avoid adding extra cases to the bridge-finding.

Code

# Questions

Another problem mentioned in the paper is maintaining the set of points in the plane such that no other point has both larger $x$ and $y$ coordinates.

1270H mentioned above is similar.

What are other problems solvable this way?

Can all problems solvable by Overmars and van Leeuwen's technique be solved by this type of segment tree?

Is it possible to optimize all these to $O(n\log{n})$? Apparently it is possible for dynamic convex hull at least.


By dragonslayerintraining, 13 months ago,

# 1209A — Paint the Numbers

Author: MikeMirzayanov

Consider the smallest element $x$ in the array. We need to paint it in some color, right?

Observe that we can paint all elements divisible by $x$ in that color as well.

So we can perform the following while the array is not empty:

• find the minimum element $x$,
• assign new color and remove all elements divisible by $x$

Complexity: $\mathcal{O}(n^2)$.
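
A minimal sketch of this greedy (the function name is my own):

```python
def min_colors(a):
    """Repeatedly take the minimum remaining element and paint every
    element divisible by it with a fresh color."""
    a = sorted(a)
    colors = 0
    while a:
        x = a[0]                           # current minimum
        a = [v for v in a if v % x != 0]   # remove all multiples of x
        colors += 1
    return colors
```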

# 1209B — Koala and Lights

Author: FieryPhoenix

Because each individual light flashes periodically, all the lights together are periodic as well.

Therefore, we can simulate the lights up to the period to get the answer.

The actual period can be calculated as follows:

Spoiler

However, computing the actual period is not necessary and a very large number will work (like $1000$).

Challenge: Can we bound the time even more?
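
A sketch of the simulation, assuming each light $i$ starts in state $s_i$ and toggles at times $b_i$, $b_i + a_i$, $b_i + 2a_i, \ldots$; the bound $1000$ is the safe one from above.

```python
def max_on(s, params):
    """Simulate every second up to a bound that exceeds the global
    period; count how many lights are on at each moment."""
    best = 0
    for t in range(1001):
        on = 0
        for (a, b), st in zip(params, s):
            toggles = 0 if t < b else (t - b) // a + 1
            on += int(st) ^ (toggles & 1)   # initial state XOR parity
        best = max(best, on)
    return best
```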

# 1209C — Paint the Digits

Author: MikeMirzayanov

The sequence must split into two non-decreasing subsequences, where the last element of the first is $\le$ the first element of the second.

Let's bruteforce the value $x$, so that all elements $< x$ go to the color $1$, all elements $> x$ go to the color $2$, and for $=x$ we are not sure.

Actually, all elements equal to $x$ that go before the first element $> x$ can safely go to color $2$, while the rest can only go to color $1$.

So we colored our sequence and we now only need to check whether this coloring is fine.

Complexity is $\mathcal{O}(10 \cdot n)$.
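
A sketch of this brute force over the threshold digit $x$; the final check simply verifies that the color-$1$ digits followed by the color-$2$ digits give the sorted sequence (helper names are my own).

```python
def paint(s):
    """Try every threshold digit x: digits < x get color 1, digits > x
    get color 2; digits equal to x appearing before the first digit > x
    can go to color 2, the rest must take color 1.
    Returns a coloring string or '-' if impossible."""
    d = [int(c) for c in s]
    for x in range(10):
        first_greater = next((i for i, v in enumerate(d) if v > x), len(d))
        col = []
        for i, v in enumerate(d):
            if v < x:
                col.append(1)
            elif v > x:
                col.append(2)
            else:
                col.append(2 if i < first_greater else 1)
        # valid iff color-1 digits then color-2 digits are sorted
        merged = [v for v, c in zip(d, col) if c == 1] + \
                 [v for v, c in zip(d, col) if c == 2]
        if merged == sorted(d):
            return ''.join(map(str, col))
    return '-'
```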

# 1209D — Cow and Snacks

Author: FieryPhoenix

Since every animal has exactly two favorite snacks, this hints that we should model the problem as a graph. The nodes are the snacks, and the edges are animals with preferences connecting snack nodes.

Let's consider a connected component of the graph with size greater than $1$. The first animal (edge) in that component to eat must take two snacks (nodes); every other satisfied animal edge eats exactly one new snack node. It is always possible to find an order so that no other animal takes two snacks (for example, BFS order). Thus, a connected component with $c$ vertices can satisfy at most $c-1$ animals.

Let $N$ be the number of snacks, $M$ be the number of animals, and $C$ be the number of connected components (including those of size 1). The number of satisfied animals is $N-C$, so the number of unhappy animals is $M-(N-C)$.

Complexity: $\mathcal{O}(n+m)$
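
A DSU sketch of this counting argument, using the formula $M - (N - C)$:

```python
def unhappy(n, pairs):
    """n snack flavours; each pair is one guest's two favourites.
    Satisfied guests = n - C where C counts connected components,
    so unhappy guests = m - (n - C)."""
    parent = list(range(n + 1))

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v

    for a, b in pairs:
        parent[find(a)] = find(b)
    c = len({find(v) for v in range(1, n + 1)})
    return len(pairs) - (n - c)
```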

# 1209E1 — Rotate Columns (easy version)

Authors: MikeMirzayanov and _kun_

There are many possible approaches; let's describe one of them.

Let's change the problem to the following:

• Rotate columns any way you want.
• Select in each row one value, maximizing the sum.

This can be done with a dp whose states are (prefix of columns, mask of taken rows). Basically, at each step we are rotating the current column and fixing values for some of the rows.

The simplest way to write this takes $3^n \cdot m \cdot n^2$ (for every submask$\to$mask transition, iterate over all possible shifts and elements to consider in the cost function).

But if we precalculate the cost function in advance, we will have $\mathcal{O}((3^n + 2^n n^2) \cdot m)$.

# 1209E2 — Rotate Columns (hard version)

The previous solution is slightly too slow to pass the large constraints.

Let's sort the columns by their maximum element. Observe that it is never optimal to use columns that come after the first $n$ columns in sorted order (we could've replaced them with some unused column among the first $n$).

So we can solve the hard version with the previous solution, considering only the best $n$ columns.

Complexity $\mathcal{O}((3^n + 2^n \cdot n^2) \cdot n + T(sort))$
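
A sketch of the whole solution (easy and hard versions combined): keep the $n$ columns with the largest maxima, precompute the per-column gain for every row mask over all rotations, then run the submask DP.

```python
def best_sum(matrix):
    """n rows, m columns; each column may be rotated cyclically, then
    one value is taken per row.  Maximize the total sum."""
    n, m = len(matrix), len(matrix[0])
    cols = [[matrix[i][j] for i in range(n)] for j in range(m)]
    cols.sort(key=max, reverse=True)
    cols = cols[:n]                     # the best n columns suffice
    full = 1 << n
    dp = [0] * full                     # dp[mask]: rows already served
    for col in cols:
        # gain[mask] = best sum over rows in mask, over all rotations
        gain = [0] * full
        for s in range(n):
            for mask in range(full):
                cur = sum(col[(i + s) % n] for i in range(n)
                          if mask >> i & 1)
                gain[mask] = max(gain[mask], cur)
        ndp = dp[:]                     # option: skip this column
        for mask in range(full):
            sub = mask
            while sub:                  # enumerate submasks of mask
                ndp[mask] = max(ndp[mask], dp[mask ^ sub] + gain[sub])
                sub = (sub - 1) & mask
        dp = ndp
    return dp[full - 1]
```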

# 1209F — Koala and Notebook

Author: dragonslayerintraining

First split all edges into directed edges with single digit labels, creating $\mathcal{O}(m\log{m})$ dummy vertices if necessary.

Since the first digit of a path is never zero (no leading zeros), longer paths always correspond to greater numbers. With a BFS, this reduces the problem to finding lexicographically minimal paths in a DAG.

To avoid needing to compare long sequences, we will instead visit the vertices in order of their lexicographically minimal path. This can be done efficiently by something like BFS/DFS.

The main idea is to visit sets of vertices at a time. If we have the set of vertices whose minimal path is $P$, we can find the set of vertices whose minimal path is $P0$ by following all outgoing $0$ edges. Then, we find the set of vertices whose minimal path is $P1$ by following all outgoing $1$ edges, and so on for all digits. Since we ignore vertices once they are visited, this is $\mathcal{O}(m\log{m})$.
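
The layered visiting can be sketched as follows (function and variable names are my own; the modulus and edge format are assumptions based on the problem statement):

```python
from collections import deque

MOD = 10**9 + 7

def min_path_values(n, edges):
    """Undirected graph with positive integer edge labels.  For each
    vertex v > 1, return the minimal number (mod 1e9+7) read along a
    path from vertex 1.  Multi-digit labels are split into chains of
    single-digit edges through dummy vertices; vertices are visited in
    sets, in order of their lexicographically minimal path."""
    adj = {}          # vertex -> digit -> list of neighbours
    free = n + 1      # next unused dummy vertex id

    def add_dir(u, v, w):
        nonlocal free
        digits = [int(c) for c in str(w)]
        for d in digits[:-1]:             # chain through dummy vertices
            adj.setdefault(u, {}).setdefault(d, []).append(free)
            u = free
            free += 1
        adj.setdefault(u, {}).setdefault(digits[-1], []).append(v)

    for u, v, w in edges:
        add_dir(u, v, w)
        add_dir(v, u, w)

    visited = {1}
    ans = {}
    queue = deque([([1], 0)])   # (set of vertices, value of their path)
    while queue:
        group, val = queue.popleft()
        for d in range(10):               # digits in increasing order
            new = []
            for u in group:
                for v in adj.get(u, {}).get(d, []):
                    if v not in visited:
                        visited.add(v)
                        new.append(v)
            if new:
                nv = (val * 10 + d) % MOD
                for v in new:
                    if v <= n:            # record real vertices only
                        ans[v] = nv
                queue.append((new, nv))
    return ans
```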

# 1209G1 — Into Blocks (easy version)

Author: Errichto

Let's solve easier version first (we will reuse core ideas in hard version as well).

Clearly, if some two integers are "hooked" like in $1 \ldots 2 \ldots 1 \ldots 2$, then they will end up being turned into the same integer.

So when we see integer $x$ with first occurrence at $a$ and last at $b$, let's mark the segment $[a; b]$ as blocked. E.g. for the array $[3, 3, 1, 2, 1, 2]$ the only bar not blocked is the one between $3$ and $1$, that is, we have $|3, 3 | 1, 2, 1, 2|$.

Now for every such segment we have to change all elements into a common element. So the answer for a segment is segment length minus the number of occurrences of the most frequent element.

One easy implementation is as follows: blocking is "+= 1 on a segment" (easily done offline in $\mathcal{O}(n + \text{operations})$); then, for an element $x$, put the number of its occurrences in the position of its first occurrence.

Now we only need to compute max on disjoint segments, so it can be done in a naive way.

Complexity: $\mathcal{O}(n)$.
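
A sketch of this for a single array (the query-free easy version); a difference array realizes the "+= 1 on a segment" blocking.

```python
from collections import Counter

def min_changes(a):
    """Block [first, last] occurrence of every value via a difference
    array; positions where the running cover drops to zero cut the
    array into blocks.  Each block costs its length minus the count of
    its most frequent value."""
    n = len(a)
    first, last = {}, {}
    for i, v in enumerate(a):
        first.setdefault(v, i)
        last[v] = i
    diff = [0] * (n + 1)
    for v in first:
        diff[first[v]] += 1     # bars after first[v] .. last[v]-1 blocked
        diff[last[v]] -= 1
    ans = cover = start = 0
    for i in range(n):
        cover += diff[i]
        if cover == 0:          # bar after position i is not blocked
            block = Counter(a[start:i + 1])
            ans += (i + 1 - start) - max(block.values())
            start = i + 1
    return ans
```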

# 1209G2 — Into Blocks (hard version)

To adjust the solution for many queries we need to create some sophisticated data structure.

E.g. we all know that the above "+= 1 on a segment" is easily done with a segtree. If we maintain for every value $a_i$ the corresponding set of occurrences, it's easy to update the above "number of occurrences in the first position".

So what do we need to do now? We need to dynamically recalculate the sum of minimums (and the set of segments over which minimums are calculated can change quite a lot due to updates).

You probably also know that we can design a segtree which supports range increments and a query for (minimum, number of minimums) on a segment. In a similar way we can build a structure which returns (minimum, number of minimums, the sum of the largest stored counts between minimums). Just maintain a few values in each node and do lazy propagation.

Complexity $\mathcal{O}(q \log n)$.

# 1209H — Moving Walkways

Author: Errichto

Some minor tips:

• Everything is a walkway. When there is no walkway, it is a walkway of speed $0$.
• You can increase all speeds by $1$ and assume that your own speed is in $[-1; +1]$.
• Energy is an entity equal to $\Delta$speed $\times$ time, which is a distance.

Also, if you spend $x$ energy on a segment of length $l$ and speed $v$, it is not important how exactly you distribute it over the walking process. In any case, you will walk on foot for $l - x$ meters in time $(l - x) / v$.

So it turns out it's better to distribute more energy to low-speed walkways (because the denominator is smaller).

Assume that you (by default) save up all energy on every non-foot segment (on foot segments it's always optimal to walk with speed $\ge 1$ ($\ge 0$ after the speeds hack), so there's no saving up there).

Build an energy graph where the $Ox$ axis corresponds to your position (not time). It is a piecewise linear function, so it is enough to store its values only at the points between walkways. Iterate over walkways in order of speed and try to steal as much energy as possible for the current walkway.

What are the limits of stealing energy?

• there is a restriction based on $l$ and $v$ (if you take too much energy, you wouldn't be able to spend it all on that walkway)
• the graph must still stay above $0$ at all points.

The latter condition is just a suffix minima on a segment tree.

Complexity: $\mathcal{O}(n \log n)$.



By dragonslayerintraining, history, 15 months ago,

# Question

Is there a data structure that supports $n$ of the following operations in $O(n\log{n})$?

1. Create a set containing a single integer in the range $[1,n]$.
2. Create the union of two sets, destroying the original sets in the process. The original sets may have overlapping ranges.
3. For any integer $x$ and a set, split the set into two sets: one with all elements less than $x$ and one with all elements greater than or equal to $x$.
4. For any integer $x$ and a set, add $x$ to all elements in the set. This operation is only allowed if all elements will remain in the range $[1,n]$.
5. Count the number of elements in a set.

# Partial Solutions

If operation 2 required the ranges to be disjoint, any balanced BST that supports split and join in $O(\log{n})$ will work.

Without operation 3, join-based tree algorithms will work since they support union in $O(m\log(\frac{n}{m}+1))$.

(Note that this is better than small to large, which takes $O(n\log^2{n})$ time total.)

Without operation 4, a version of segment trees will work.
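
For reference, a sketch of the small-to-large baseline mentioned above: a plain sorted list plus a lazy additive offset supports all five operations (split here copies via slicing, which is fine for a sketch though it gives up the balanced-BST split bound; all names are my own).

```python
import bisect

class IntSet:
    """Multiset as a sorted list with a lazy additive offset."""
    def __init__(self, xs=(), off=0):
        self.xs = sorted(xs)        # true value = stored value + off
        self.off = off

    @staticmethod
    def single(x):                  # op 1: singleton set
        return IntSet([x])

    def union(self, other):         # op 2: insert smaller into larger
        big, small = (self, other) if len(self.xs) >= len(other.xs) \
                     else (other, self)
        for v in small.xs:
            bisect.insort(big.xs, v + small.off - big.off)
        return big

    def split(self, x):             # op 3: elements < x and >= x
        i = bisect.bisect_left(self.xs, x - self.off)
        return IntSet(self.xs[:i], self.off), IntSet(self.xs[i:], self.off)

    def add(self, x):               # op 4: shift all elements by x
        self.off += x

    def size(self):                 # op 5
        return len(self.xs)
```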

# Full Solution

I was able to prove that binary search trees will solve the problem in $O(n\log^2{n})$. (Was this "well known" before?)

If we merge using split and join, this bound is tight. A set can be constructed by merging the set of even-index elements and the set of odd-index elements, which takes $O(n\log{n})$. Those two sequences could have been constructed the same way recursively. This takes $O(n\log^2{n})$ time for $O(n)$ operations.

However, I conjecture that if we use join-based tree algorithms, it will actually be $O(n\log{n})$ in the worst case. (For example, in the case above, merges take $O(n)$.)

Can anyone prove or disprove this conjecture for any join-based tree algorithm?

If anyone can prove it for splay trees instead, or has another solution, I would also be interested.

Thanks.