Hello everyone, I found this problem a while ago and I'm still stuck on it (there's no OJ to submit it to, as far as I know):

You are given a permutation of the integers from 1 to *N*. Here, *moving* an element from *i* to *j* is defined as removing the element from position *i* and inserting it at position *j*. For example, in the array `[4,5,2,1,3]`, moving the element from 2 to 5 gives us the array `[4,2,1,3,5]`. The cost of moving an element from *i* to *j* is defined as *i* + *j* (1-indexed).

What is the minimum total cost needed to sort the permutation?

Example: `[1,3,4,5,2]` → `[1,2,3,4,5]` with total cost 5 + 2 = 7 (move the element at position 5 to position 2).

1 ≤ *N* ≤ 1000, however solutions with any reasonable complexity are welcome.
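To make the operation concrete, here is a small sketch of a single move and its cost; the helper name is mine, not part of the statement:

```java
import java.util.*;

public class MoveCost {
    // Remove the element at 1-indexed position i and insert it at
    // 1-indexed position j; returns the cost i + j of that move.
    static int move(List<Integer> a, int i, int j) {
        int x = a.remove(i - 1);
        a.add(j - 1, x);
        return i + j;
    }

    public static void main(String[] args) {
        List<Integer> a = new ArrayList<>(Arrays.asList(4, 5, 2, 1, 3));
        int c = move(a, 2, 5);
        System.out.println(a + " cost " + c); // [4, 2, 1, 3, 5] cost 7
    }
}
```

Running it reproduces the `[4,5,2,1,3]` example from the statement.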

Auto comment: topic has been updated by f2lk6wf90d (previous revision, new revision, compare).

Here's a brute-force helper if anyone wants to play around and try to spot a pattern. Interestingly, the worst-case scenario is not always n, n - 1, ..., 2, 1. The next step would perhaps be writing some greedy algorithm and comparing the outputs?
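The helper's code isn't preserved here; a brute force along these lines (Dijkstra over all reachable permutations, where every move (i, j) is an edge of weight i + j; all names mine, feasible only for tiny n) could serve the same purpose:

```java
import java.util.*;

public class BruteForceHelper {
    // Dijkstra from the given permutation to the sorted one.
    static long minCost(int[] perm) {
        int n = perm.length;
        int[] sorted = perm.clone();
        Arrays.sort(sorted);
        String goal = Arrays.toString(sorted);
        Map<String, Long> dist = new HashMap<>();
        PriorityQueue<Object[]> pq =
            new PriorityQueue<>((x, y) -> Long.compare((long) x[0], (long) y[0]));
        dist.put(Arrays.toString(perm), 0L);
        pq.add(new Object[]{0L, perm});
        while (!pq.isEmpty()) {
            Object[] top = pq.poll();
            long d = (long) top[0];
            int[] a = (int[]) top[1];
            String key = Arrays.toString(a);
            if (d > dist.get(key)) continue;     // stale queue entry
            if (key.equals(goal)) return d;
            for (int i = 0; i < n; i++) {
                for (int j = 0; j < n; j++) {
                    if (i == j) continue;
                    // Move element from 0-indexed i to 0-indexed j.
                    int[] b = a.clone();
                    int x = b[i];
                    if (i < j) System.arraycopy(a, i + 1, b, i, j - i);
                    else System.arraycopy(a, j, b, j + 1, i - j);
                    b[j] = x;
                    long nd = d + (i + 1) + (j + 1);
                    String nk = Arrays.toString(b);
                    if (nd < dist.getOrDefault(nk, Long.MAX_VALUE)) {
                        dist.put(nk, nd);
                        pq.add(new Object[]{nd, b});
                    }
                }
            }
        }
        return -1; // unreachable: the sorted state is always reachable
    }

    public static void main(String[] args) {
        System.out.println(minCost(new int[]{1, 3, 4, 5, 2})); // 7
        System.out.println(minCost(new int[]{2, 3, 4, 5, 1})); // 6
    }
}
```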

I've already tried this :(

Here's some code (uses Floyd-Warshall instead) that generates a shortest-path tree, readable by `graphviz`, for n = 5:

The graph is symmetric, so you don't really need to do all-pairs shortest paths.

Yes, I know it's a huge overkill. I originally wanted to see if there was a pattern when transforming any permutation to any other permutation.

Worst cases (all with the largest cost listed):

Searching OEIS for the sequence of costs was not fruitful.

Here's a shortest path tree with weights on the edges: http://svgshare.com/i/4hW.svg

I have a feeling that an answer with minimum cost is also an answer that uses the minimum number of moves; could you confirm this?

It feels like there are two cases: either we move the largest element at some point, or we never move it, as in `[2, 3, 4, 5, 1]` (where it's cheapest to just move the 1 to the front).

In the first case, let's have a look at the optimal solution and all the moves it makes before moving the largest element x (into its final position). Consider the parts to the left and right of x respectively; call them HL and HR. We either have moves within HL, within HR, or between HL and HR. If we move x to its final position directly (i.e. to the end of the list), the moves within HL are unaffected, the moves within HR become cheaper (!) as the index of each element decreases by 1, and the moves between HL and HR also become cheaper. So if we are going to move the largest element, we might just as well do it directly. After we do this, we end up with the same problem but of size N - 1; solve recursively.

In the second case (and this is where I'm unsure and feel I'm almost guessing, or going on intuition), consider again the two halves HL and HR. Notice that there won't be any moves from HL to HR, as that would mean that we would later need to move x, which would put us in case 1. Perform the moves to get HL sorted; this reduces to the same problem but of some size M < N (solve recursively). Then, for each element y in HR, going left to right, move it into its final position in HL.

So all in all, we just try both cases and pick the better one. The complexity should be quadratic.
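If I've read the two cases correctly, the recursion can be sketched like this. This is a naive memoized reconstruction of my own (not the commenter's actual code), assuming positions are 1-indexed as in the statement:

```java
import java.util.*;

public class TwoCaseSort {
    static final Map<List<Integer>, Long> memo = new HashMap<>();

    // Minimum cost to sort the given (sub)permutation, trying both cases:
    // 1) move the maximum straight to the end, recurse on the rest;
    // 2) never move the maximum: sort HL recursively, then merge each
    //    element of HR into its final position, left to right.
    static long solve(List<Integer> a) {
        int n = a.size();
        boolean sorted = true;
        for (int i = 0; i + 1 < n; i++) if (a.get(i) > a.get(i + 1)) sorted = false;
        if (sorted) return 0;
        Long cached = memo.get(a);
        if (cached != null) return cached;

        int p = 0;                                   // 0-indexed position of the maximum
        for (int i = 1; i < n; i++) if (a.get(i) > a.get(p)) p = i;

        // Case 1: pay (p+1) + n to move the maximum to the end.
        List<Integer> rest = new ArrayList<>(a);
        rest.remove(p);
        long best = (p + 1) + n + solve(rest);

        // Case 2: sort HL, then move each y in HR to its final spot.
        List<Integer> hl = new ArrayList<>(a.subList(0, p));
        long cost2 = solve(hl);
        TreeSet<Integer> left = new TreeSet<>(hl);   // everything left of the maximum
        int m = hl.size();
        for (int k = 1; k <= n - 1 - p; k++) {
            int y = a.get(p + k);
            int from = m + k + 1;                    // y's current 1-indexed position
            int to = left.headSet(y).size() + 1;     // its final position on the left
            cost2 += from + to;
            left.add(y);
        }
        best = Math.min(best, cost2);
        memo.put(a, best);
        return best;
    }

    public static void main(String[] args) {
        System.out.println(solve(Arrays.asList(1, 3, 4, 5, 2))); // 7
        System.out.println(solve(Arrays.asList(2, 4, 1, 5, 3))); // 12
    }
}
```

Memoizing on the whole sublist is of course wasteful; it's only meant to mirror the recursion as described.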

But I'm probably mistaken, so please do give me a counter-example showing where the above approach goes wrong (if it is wrong). We might all learn something from it, and be one step closer to the real solution. :)

Could you give an example for `[2,4,1,5,3]`?

So yeah, for `[2,4,1,5,3]` the optimal solution falls into the second case: x = 5, HL = [2, 4, 1] and HR = [3].

Solve HL recursively: solving (i.e. sorting) `[2,4,1]` is equivalent to solving `[2,3,1]`, whose optimal solution also happens to be of case 2, with x = 3, HL = [2] and HR = [1]. Solving that inner HL recursively hits our base case and returns cost 0. Then we move each element of HR into its final position in HL: we move 1 to the front, which costs 1 + 3 = 4. We also try `[2,3,1]` using case 1, but that doesn't give a result better than 4. (Using case 1 will yield a cost of 8.)

Back at the top level we now have x = 5, HL = [1, 2, 4] and HR = [3]. We move each element of HR into its final position in HL: we move 3 to the third position, which costs 3 + 5 = 8, for a total of 12. We also try case 1, which leaves `[2,4,1,3]` after moving 5, which isn't better than 12, so 12 is the optimal answer.

(I wish I could do proper sub-bullets to illustrate the recursive steps better, but yeah...)

This sounds good. If you'd like to post an implementation of your idea too, testcases are here. Note: you have to compress the integers in the input first (they are all distinct), and sort in descending order instead, so you have to set every element i of the permutation to N - i + 1.

So the first test case is, if I'm not mistaken, equivalent to `[4,5,1,3,2]`. The output says that the answer is 11, whereas my code, your graph, and my brain say it's 17. Or am I reading the input incorrectly?

The first test case is equivalent to `[2,1,5,3,4]`, since we're sorting in descending order instead.

OK, then all the results match up.
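The compress-and-flip step described above can be sketched like this. The concrete input values below are made up for illustration (the real data is in the linked testcases), chosen so that their ranks read `[4,5,1,3,2]` and the flipped result is `[2,1,5,3,4]`:

```java
import java.util.*;

public class Normalize {
    // Coordinate-compress distinct values to ranks 1..N, then flip each
    // rank r to N - r + 1 so that a descending sort becomes an ascending one.
    static int[] normalize(int[] a) {
        int n = a.length;
        int[] sorted = a.clone();
        Arrays.sort(sorted);
        Map<Integer, Integer> rank = new HashMap<>();
        for (int i = 0; i < n; i++) rank.put(sorted[i], i + 1);
        int[] out = new int[n];
        for (int i = 0; i < n; i++) out[i] = n - rank.get(a[i]) + 1;
        return out;
    }

    public static void main(String[] args) {
        // Hypothetical input values whose ranks are [4, 5, 1, 3, 2].
        System.out.println(Arrays.toString(normalize(new int[]{30, 40, 5, 10, 7})));
        // [2, 1, 5, 3, 4]
    }
}
```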

So my Java solution runs in 16 sec on the last test case. I could probably cut it down to below 10 sec with some custom data structures (instead of ones requiring auto-boxing and so on), or even better rewrite it in C++. ^^

So yeah, on to figuring out how to make the algorithm faster...

OK, so when I need to calculate the number of elements less than some element y, instead of using a `TreeSet<Integer>` and `headSet(y)`, I used a custom segment tree in an `int[]` array, and all of a sudden the code ran in < 2 sec. -_-

For N = 1000? That's very interesting! Could you share your implementation?

Sure, here it is: https://pastebin.com/eZFY0KHv

Not that I think it's too much help, since it's full of hacks to speed up the execution. The outline above should be more understandable.
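For reference, a Fenwick (binary indexed) tree is about the smallest structure that does the same counting job as `headSet(y)`; this sketch is mine, not the code from the pastebin:

```java
public class Fenwick {
    // Fenwick tree over values 1..n; countLess(y) returns how many
    // inserted values are strictly less than y, like headSet(y).size().
    final int[] tree;

    Fenwick(int n) { tree = new int[n + 1]; }

    void insert(int v) {
        for (; v < tree.length; v += v & -v) tree[v]++;
    }

    int countLess(int y) {          // number of inserted values in [1, y-1]
        int s = 0;
        for (int v = y - 1; v > 0; v -= v & -v) s += tree[v];
        return s;
    }

    public static void main(String[] args) {
        Fenwick f = new Fenwick(10);
        f.insert(2); f.insert(7); f.insert(4);
        System.out.println(f.countLess(5)); // 2  (the values 2 and 4)
    }
}
```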

Thank you!

So one place where the reasoning goes wrong is when it comes to the time complexity.

I thought that we'd only have O(N) dp-states, and that therefore the complexity was O(N^2). However, this is not true. First of all, I didn't account for the normalization, which, unless you use counting sort or the like, will account for an extra factor. The factor also shows up when we need to calculate the position in HL where we will move an element of HR.

But more importantly, for the above algorithm as stated, there are not O(N) dp-states but O(N^2) dp-states. More precisely, the number of states is the number of distinct prefixes of our input (after normalization) you can get by removing the q largest elements, for some q. In practice this number has a pretty good constant, because if the i-th largest element has position Q_i, then its removal will only affect prefixes of length at least Q_i. And the normalization also makes it so that the dp-state changes only if the removed element causes a change in the relative order.

So even if the total complexity becomes worse than quadratic, an efficient implementation has a good chance of making a time limit of around a few seconds.
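The state count described above can be checked empirically; this small sketch (names mine) enumerates the distinct normalized prefixes reachable by deleting the q largest elements:

```java
import java.util.*;

public class StateCount {
    // Distinct dp-states for a permutation of 1..n: normalized prefixes of
    // the array obtained by deleting the q largest elements, q = 0..n-1.
    static Set<List<Integer>> states(int[] perm) {
        int n = perm.length;
        Set<List<Integer>> seen = new HashSet<>();
        for (int q = 0; q < n; q++) {
            List<Integer> rest = new ArrayList<>();
            for (int v : perm) if (v <= n - q) rest.add(v);  // drop q largest
            for (int len = 1; len <= rest.size(); len++) {
                List<Integer> pre = rest.subList(0, len);
                // Normalize: replace each value by its rank within the prefix.
                List<Integer> sorted = new ArrayList<>(pre);
                Collections.sort(sorted);
                List<Integer> norm = new ArrayList<>();
                for (int v : pre) norm.add(sorted.indexOf(v) + 1);
                seen.add(norm);
            }
        }
        return seen;
    }

    public static void main(String[] args) {
        int[] perm = {2, 4, 1, 5, 3};
        System.out.println(states(perm).size() + " states, crude bound "
                + perm.length * perm.length); // 8 states, crude bound 25
    }
}
```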

I have a few points. Not sure how useful, but here goes:

Suppose we make a move (i, j) where i < j; then we call this move "ltr", otherwise we call it "rtl". For two ltr moves (i1, j1) and (i2, j2):

a) If i1 < i2, j2 < j1, i1 < j1 and i1 < j2, then we must make (i1, j1) before (i2, j2).

b) If i1 < i2, j1 < j2 and i2 < j1, then the order of making these moves doesn't matter.

Similarly, we can find rules for rtl moves.

You are referring to problem "Sorting" in BOI 2007 (solution, judge)

You gave a wrong link

Fixed. Thanks for noticing this!

Thanks!!

I had no idea this was a duplicate problem. This was a problem from the Greek team selection contest in 2013. Looking up reformulations of the statement didn't return anything relevant. Thank you!