Hello Codeforces! This is a new sorting algorithm built by combining two ideas: merge sort and square root decomposition.
Idea: Let's say n is the size of the array. Let's define some integer m and divide our array into chunks of size m. The number of chunks will be n/m. So let's sort all these small chunks and then, using the idea of merge sort, merge all of the sorted n/m chunks.
Now here is a key point: while merging, we don't have to merge all the chunks at once. Why is that? A simple analysis: if we merge all n/m chunks simultaneously, then for every output position we spend O(n/m) time finding the minimum, so the total is O(n^2 / m). But can we somehow do this with fewer comparisons? The answer is YES.
But how? Just take the first two chunks and merge them. Then take the next two, and so on. Then repeat the operation with these merged chunks until only one chunk of size n remains. Why is this an improvement? In every round the number of chunks halves and the size of each chunk doubles.
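Here is a minimal C++ sketch of this scheme (the name chunkSort and the use of std::sort for the small chunks are my own illustration choices, not from the post; the recursive version analyzed below would sort each chunk with the same algorithm):

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Split the array into chunks of size m, sort each chunk,
// then repeatedly merge adjacent pairs of sorted chunks.
void chunkSort(std::vector<int>& a, std::size_t m) {
    std::size_t n = a.size();
    if (n == 0 || m == 0) return;

    // 1) Sort every chunk of size m (the last chunk may be shorter).
    for (std::size_t i = 0; i < n; i += m)
        std::sort(a.begin() + i, a.begin() + std::min(i + m, n));

    // 2) Merge adjacent chunks round by round; the chunk size doubles and the
    //    number of chunks halves each round, until one chunk of size n remains.
    for (std::size_t len = m; len < n; len *= 2)
        for (std::size_t i = 0; i + len < n; i += 2 * len)
            std::inplace_merge(a.begin() + i,
                               a.begin() + i + len,
                               a.begin() + std::min(i + 2 * len, n));
}

int main() {
    std::vector<int> v = {5, 1, 4, 2, 8, 7, 3, 6, 0};
    chunkSort(v, static_cast<std::size_t>(std::sqrt(v.size())));  // m = sqrt(n)
    // v is now {0, 1, 2, 3, 4, 5, 6, 7, 8}
}
```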
Now let's see how many times the size of the chunk containing element i increases. Since the growth is exponential, there are lg(n/m) merge rounds, and each round touches every element once, so merging n elements takes O(n lg(n/m)) in total. This is much better.
Now let's compute the total runtime.
The recursive formula for this is $$$T(n) = \frac{n}{m}\,T(m) + O\!\left(n \log \frac{n}{m}\right)$$$.
Now, what value of m can we give to this? Let's use the idea of square root decomposition and say m = √n.
What will be the runtime? O(n lg n).
To see how this recurrence is solved, see the sketch below.
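Here is one way the recurrence can be solved (my own sketch of the computation credited below): with $$$m = \sqrt{n}$$$ the formula becomes $$$T(n) = \sqrt{n}\,T(\sqrt{n}) + O(n \log n)$$$. Write $$$n = 2^k$$$ and $$$S(k) = T(2^k)/2^k$$$; dividing the recurrence by $$$2^k$$$ gives $$$S(k) = S(k/2) + O(k)$$$, and unrolling it yields $$$S(k) = O(k + k/2 + k/4 + \dots) = O(k)$$$, so $$$T(n) = O(n \log n)$$$.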
Hope you like this algorithm.
We can also choose m = n/2, but then this is just merge sort (but the world wanted something different).
Also, the credit for computing the runtime goes to secretno_botaem.
Well, I want to say that you'd better use $$$\log$$$ instead of $$$\lg$$$ because the latter means $$$\log_{10}$$$.
You can write it like this by putting \log between two $'s.
All logs are the same in big-O notation.
So we conclude the two notations mean the same thing here.
So what are the advantages of this algorithm?
I mean what kind of input can make this algorithm quicker than other $$$n\log n$$$ algorithms?
If someone can come up with a way of merging all of the O(√n) chunks in O(n) time, then we have the best ever sorting algorithm, which takes O(n lg(lg n)) time. That is still a mystery. We know counting sort takes O(n + k), but having O(k) in the runtime is not good. So it's a challenge for everyone to find a way of merging in O(n) and change the world.
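A quick sketch of why that hypothetical merge would give this bound (my own reasoning, assuming an $$$O(n)$$$ merge of the $$$\sqrt{n}$$$ chunks existed): the recurrence would become $$$T(n) = \sqrt{n}\,T(\sqrt{n}) + O(n)$$$. With $$$n = 2^k$$$ and $$$S(k) = T(2^k)/2^k$$$ this reads $$$S(k) = S(k/2) + O(1)$$$; since $$$k$$$ can be halved only $$$\log k$$$ times, $$$S(k) = O(\log k)$$$ and therefore $$$T(n) = O(n \log\log n)$$$.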
Comparison-based sorting cannot do better than $$$\Theta(n \log n)$$$.
Is there any proof for it? I really don't know about that. Can you share its proof?
Yes, there is; you can check it on Google.
To clarify, this only applies to sorting algorithms that only compare elements: radix sort and counting sort get around that because they do other stuff with the elements too.
But we assume that the only way the algorithm can know which of two elements is bigger is to call a comparator function.
Say you want to sort an array of $$$n$$$ elements. There can be up to $$$n!$$$ different orders this array can be in. Say $$$k$$$ is the number of comparisons we make in the worst case. If $$$2^k < n!$$$, then it is impossible to tell some of these orders apart. Thus we must have $$$2^k \ge n!$$$. Take the logarithm, we see that $$$k \ge \log n!$$$. It is known (Stirling's approximation) that $$$\log n! = \Omega(n \log n)$$$, so we must make $$$\Omega(n \log n)$$$ comparisons in the worst case.
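A tiny concrete instance of this argument: for $$$n = 3$$$ there are $$$3! = 6$$$ possible orders, and two comparisons can distinguish at most $$$2^2 = 4$$$ of them, so at least $$$\lceil \log_2 6 \rceil = 3$$$ comparisons are needed in the worst case.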
So basically a hybrid merge sort? I mean, it seems cool on paper, but if you look at it closely, it is basically merge sort, except that when the array size gets smaller than $$$m$$$, you use another sorting algorithm.
sounds like merge sort with extra steps