Shameek's blog

By Shameek, 11 months ago, In English

question — Kuroni and Cowsheds

link to question —

Can someone please explain the solution?

I couldn't do better than O(Q*N): for each query, walk from L to R and merge the mirrored pairs with a DSU.
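For reference, here is a minimal sketch (my own, not from the problem setters) of that naive O(N)-per-query baseline, assuming each query [L, R] merges every mirrored pair of positions:

```python
class DSU:
    """Plain union-find with path halving."""
    def __init__(self, n):
        self.p = list(range(n))

    def find(self, x):
        while self.p[x] != x:
            self.p[x] = self.p[self.p[x]]  # path halving
            x = self.p[x]
        return x

    def merge(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.p[rb] = ra

def naive_query(dsu, l, r):
    """O(r - l) per query: merge each mirrored pair (l, r), (l+1, r-1), ..."""
    while l < r:
        dsu.merge(l, r)
        l += 1
        r -= 1
```

With Q queries this is O(Q*N) total, which is exactly the bottleneck being asked about.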

solution —


11 months ago, # |

I couldn't solve it myself but I'll explain a solution I've heard from someone else.

Observe that across all queries combined, two different DSU sets get merged at most N-1 times in total. Now look at a single query [L, R], and let a_1, a_2, ..., a_2k be, in order, the elements that actually get merged during it; by the observation above, the sum of k over all queries is less than N. These elements partition the query interval into 2k+1 parts, where the i-th part is identical to the reversed (2k+2-i)-th part, so those parts trigger no merges. Instead of scanning linearly, we can find each such part with binary search + hashing (hashing the DSU labels of the positions) in O((log N)^2). Since there are O(N+Q) such parts across all queries, the final complexity is O((N+Q)(log N)^2). Code.
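A rough self-contained sketch of this idea (mine, not the linked code), assuming the query constrains [L, R] to read the same forwards and backwards, so position L+i gets merged with R-i. It keeps two Fenwick trees of polynomial hashes over the current DSU root labels (one for the array, one for its reverse), relabels the smaller set on each merge (small-to-large, so relabeling is O(N log N) updates overall), and binary-searches the longest prefix of pairs whose labels already match. The class and method names are my own:

```python
MOD = (1 << 61) - 1
B = 131

class Fenwick:
    def __init__(self, n):
        self.n, self.t = n, [0] * (n + 1)
    def add(self, i, v):
        i += 1
        while i <= self.n:
            self.t[i] = (self.t[i] + v) % MOD
            i += i & -i
    def prefix(self, i):
        s, i = 0, i + 1
        while i > 0:
            s = (s + self.t[i]) % MOD
            i -= i & -i
        return s
    def range(self, a, b):  # sum over [a, b]
        return (self.prefix(b) - (self.prefix(a - 1) if a else 0)) % MOD

class PalindromeDSU:
    def __init__(self, n):
        self.n = n
        self.parent = list(range(n))
        self.label = list(range(n))          # label[x] == current root of x
        self.members = [[i] for i in range(n)]
        self.pw = [1] * n
        for i in range(1, n):
            self.pw[i] = self.pw[i - 1] * B % MOD
        self.fwd = Fenwick(n)                # sum of label[i] * B^i
        self.rev = Fenwick(n)                # sum of label[i] * B^(n-1-i)
        for i in range(n):
            self.fwd.add(i, i * self.pw[i] % MOD)
            self.rev.add(i, i * self.pw[n - 1 - i] % MOD)

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def _relabel(self, x, new):
        d = (new - self.label[x]) % MOD
        self.fwd.add(x, d * self.pw[x] % MOD)
        self.rev.add(x, d * self.pw[self.n - 1 - x] % MOD)
        self.label[x] = new

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if len(self.members[ra]) < len(self.members[rb]):
            ra, rb = rb, ra
        self.parent[rb] = ra
        for x in self.members[rb]:           # small-to-large relabeling
            self._relabel(x, ra)
        self.members[ra].extend(self.members[rb])
        self.members[rb] = []

    def _match(self, l, r, k):
        # Do the first k pairs (l..l+k-1) vs (r..r-k+1) all have equal labels?
        h1 = self.fwd.range(l, l + k - 1)          # B^l       * sum lab[l+j]*B^j
        h2 = self.rev.range(r - k + 1, r)          # B^(n-1-r) * sum lab[r-j]*B^j
        return h1 * self.pw[self.n - 1 - r] % MOD == h2 * self.pw[l] % MOD

    def palindrome_query(self, l, r):        # 0-indexed, inclusive
        while l < r:
            lo, hi = 0, (r - l + 1) // 2     # max prefix of already-equal pairs
            while lo < hi:
                mid = (lo + hi + 1) // 2
                if self._match(l, r, mid):
                    lo = mid
                else:
                    hi = mid - 1
            l += lo; r -= lo                 # skip the matching part for free
            if l < r:
                self.union(l, r)             # one of the <= N-1 real merges
                l += 1; r -= 1

    def same(self, a, b):
        return self.find(a) == self.find(b)
```

Each real merge is found by one O((log N)^2) binary search, matching the complexity argument above (up to the hash-update log factors).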

  • »
    11 months ago, # ^ |

    In your code, why are you checking the result of the DSU merge as well?

    while (l < r && dsu.merge(l, r - 1, tr))

    Once we get, from the binary search, the length from which we can start merging (i.e. L + len and R - len are in the same component), don't we just merge inward until the pointers meet?

    • »
      11 months ago, # ^ |
      Rev. 2

      It could be the case that there are multiple isolated merges separated by intervals that don't get merged. We have to repeat two steps — (1) binary search to find the maximal interval that doesn't get merged, then (2) merge the relevant elements — multiple times until we're done processing the query [L, R].
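      A toy illustration of that loop structure (mine, with a plain linear scan standing in for the binary search, and a brute-force relabel standing in for the DSU merge):

      ```python
      def constrain_palindrome(labels, l, r):
          """Record the isolated merges needed to make labels[l..r] a palindrome."""
          merges = []
          while l < r:
              # (1) skip the maximal already-matching interval
              #     (the real solution binary-searches this with hashes)
              while l < r and labels[l] == labels[r]:
                  l += 1; r -= 1
              if l < r:
                  # (2) one real merge; brute-force relabel stands in for DSU union
                  merges.append((l, r))
                  old, new = labels[r], labels[l]
                  for i, v in enumerate(labels):
                      if v == old:
                          labels[i] = new
                  l += 1; r -= 1
          return merges

      print(constrain_palindrome(list("axcdcyb"), 0, 6))  # -> [(0, 6), (1, 5)]
      ```

      Here the merges at (0, 6) and (1, 5) are separated later by the matching interval at (2, 4), so a single skip-then-merge pass is not enough — hence checking the merge result inside the loop.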