### ggdwbg's blog

By ggdwbg, history, 3 years ago,

Okay, so I'm going to write about a new sorting algorithm which I discovered recently while playing with sqrt decomposition.

I've told 4 people I know about it and in 4 out of 4 cases their reaction was "man, you're a genius, how could nobody actually come up with this before?!".

Well, no, I'm just kidding: 4 out of 4 asked what it was for and told me I was just wasting my time coming up with useless crap, BUT I am going to write about it nevertheless.

The idea is simple. You just do sqrt decomposition for RMQ. Suppose $\mathrm{BlockSize}$ is (wow) your block size and $\mathrm{BlockMin}[k]$ is the index of the minimum element in the $k$-th block. On every iteration, suppose you already have the prefix $[0, s)$ sorted. First you find the index of the minimum over the blocks from $B := 1 + \left \lfloor \frac{s}{\mathrm{BlockSize}} \right \rfloor$ to the last one, call that index $\mathrm{min}_1$. The minimum could also be in the block containing $s$, so you also iterate over $[s, \mathrm{BlockSize} \cdot B)$, finding the index $\mathrm{min}_2$ of the minimum there; then you compare $a[\mathrm{min}_1]$ with $a[\mathrm{min}_2]$ and let $\mathrm{min}$ be the index of the smaller one. Now you swap $a[\mathrm{min}]$ with $a[s]$ and update $\mathrm{BlockMin}\left[\left \lfloor \frac{\mathrm{min}}{\mathrm{BlockSize}} \right \rfloor\right]$ if necessary (i.e. if $\left \lfloor \frac{\mathrm{min}}{\mathrm{BlockSize}} \right \rfloor \geq B$, since the swap may have changed that block's minimum; blocks below $B$ are never consulted again, so they need no update). After that, $[0, s+1)$ is sorted and all invariants are maintained. $\blacksquare$

Setting $\mathrm{BlockSize} = \left \lceil \sqrt{n} \right \rceil$ yields $\mathcal{O}(n^{3/2})$ running time: each of the $n$ iterations scans $\mathcal{O}(\sqrt{n})$ block minima plus a partial block of $\mathcal{O}(\sqrt{n})$ elements. The algorithm is not in-place; $\mathcal{O}(\sqrt{n})$ additional memory is required for $\mathrm{BlockMin}$.
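A minimal C++ sketch of the procedure above, using the blog's names ($\mathrm{BlockSize}$, $\mathrm{BlockMin}$); the `rescan` helper is my own addition for updating a block's minimum after a swap, which stays within the $\mathcal{O}(\sqrt{n})$ per-iteration budget:

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Selection sort accelerated by sqrt-decomposition RMQ, as described above.
void sqrtSort(std::vector<int>& a) {
    const int n = (int)a.size();
    if (n <= 1) return;
    const int BlockSize = (int)std::ceil(std::sqrt((double)n));
    const int nBlocks = (n + BlockSize - 1) / BlockSize;

    // BlockMin[k] = index of the minimum element in block k.
    std::vector<int> BlockMin(nBlocks);
    auto rescan = [&](int k) {
        int lo = k * BlockSize, hi = std::min(n, lo + BlockSize);
        BlockMin[k] = (int)(std::min_element(a.begin() + lo, a.begin() + hi)
                            - a.begin());
    };
    for (int k = 0; k < nBlocks; ++k) rescan(k);

    for (int s = 0; s < n; ++s) {
        int B = s / BlockSize + 1;  // first block fully to the right of s
        // Scan the partial block [s, BlockSize * B) directly...
        int min_idx = s;
        for (int i = s + 1; i < std::min(n, B * BlockSize); ++i)
            if (a[i] < a[min_idx]) min_idx = i;
        // ...then consult BlockMin for the full blocks B, B+1, ...
        for (int k = B; k < nBlocks; ++k)
            if (a[BlockMin[k]] < a[min_idx]) min_idx = BlockMin[k];
        std::swap(a[s], a[min_idx]);
        // The swap may have changed the minimum of a full block.
        if (min_idx / BlockSize >= B) rescan(min_idx / BlockSize);
    }
}
```

Rescanning the affected block takes $\mathcal{O}(\sqrt{n})$, so every iteration is $\mathcal{O}(\sqrt{n})$ total and the whole sort is $\mathcal{O}(n^{3/2})$.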

Actually, there's an interesting implication of this $\left( \mathrm{data \; structure} \Rightarrow \mathrm{sorting \; algorithm} \right)$ argument. Suppose you have a comparison-based data structure that can do updates and minimum queries in true (worst-case, non-amortized) $\mathcal{O}(f(n))$ time. Then this instantly gives you an $\mathcal{O}(n f(n))$ sorting algorithm, and if $f(n)$ is asymptotically strictly smaller than $\log n$, such a data structure cannot exist, because of the $\Omega(n \log n)$ comparison lower bound for sorting.
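As a concrete instance of this argument (my illustration, not from the post): `std::multiset` is a comparison-based structure with $\mathcal{O}(\log n)$ insert and delete-min, so threading an array through it sorts in $\mathcal{O}(n \log n)$, and any structure beating $\log n$ per operation would beat the sorting lower bound the same way:

```cpp
#include <set>
#include <vector>

// Sorting via a comparison-based data structure: n inserts followed by
// n delete-mins. With std::multiset (a balanced BST), each operation is
// O(log n), so the sort is O(n log n) -- exactly the (data structure =>
// sorting algorithm) reduction described above.
std::vector<int> sortViaStructure(const std::vector<int>& a) {
    std::multiset<int> s(a.begin(), a.end());  // n inserts, O(log n) each
    std::vector<int> out;
    out.reserve(a.size());
    while (!s.empty()) {                       // n delete-mins, O(log n) each
        out.push_back(*s.begin());
        s.erase(s.begin());
    }
    return out;
}
```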

So this can actually be considered a proof of the optimality of segment trees, at least in the comparison model.
