### Shameek's blog

By Shameek, 11 months ago,

Question: Kuroni and Cowsheds

can someone please explain the solution?

I couldn't do better than O(Q*N): basically walk from L to R and merge pairs with a DSU.

• -7

 » 11 months ago, # |   +14

I couldn't solve it myself, but I'll explain a solution I heard from someone else.

Across all queries combined, observe that merging two different DSU sets happens at most N-1 times. Now look at a single query [L, R]. Let a_1, a_2, ..., a_2k be, in order, the elements that actually get merged during this query. As observed, the sum of k over all queries is less than N. These elements partition the query range into 2k+1 parts, where the i-th part is identical to the reversed (2k+2-i)-th part. Instead of a linear scan, we can use binary search + hashing (hashing the DSU labels) to find each of these parts in O((log N)^2). Since there are O(N+Q) such parts across all queries, the final complexity is O((N+Q)(log N)^2). Code.
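To make the merging structure concrete, here is a minimal Python sketch of the query loop described above, assuming (as in this problem) that a query [L, R] asserts the range is a palindrome, so position L+i must be unified with position R-i. For simplicity this sketch skips already-equal positions by comparing DSU roots pairwise instead of binary searching over hashes of the labels, so it has the naive O(Q*N) worst case, not the O((N+Q)(log N)^2) of the real solution; the skeleton of "skip equal parts, merge where labels differ" is the same.

```python
class DSU:
    """Plain disjoint-set union with path halving (no hashing here)."""

    def __init__(self, n):
        self.parent = list(range(n))

    def find(self, x):
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def merge(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra == rb:
            return False  # already in the same component
        self.parent[ra] = rb
        return True


def process_query(dsu, l, r):
    """Assert that [l, r] is a palindrome: unify position l+i with r-i.

    The full solution binary searches (with hashes of DSU labels) for the
    maximal prefix that is already equal; this sketch skips those positions
    one by one instead, then performs the next real merge.
    """
    while l < r:
        # Skip the maximal block whose endpoints are already unified.
        while l < r and dsu.find(l) == dsu.find(r):
            l += 1
            r -= 1
        # Next pair differs: this is one of the at most N-1 real merges.
        if l < r:
            dsu.merge(l, r)
            l += 1
            r -= 1


# Tiny demo: "abcdef" with the constraint that [0, 5] is a palindrome.
dsu = DSU(6)
process_query(dsu, 0, 5)          # unifies (0,5), (1,4), (2,3)
process_query(dsu, 0, 2)          # only (0,2) is a new merge; (1,1) is trivial
```

Note that the total number of successful `dsu.merge` calls over all queries is bounded by N-1, which is exactly why the real solution only needs to locate each "merge point" quickly (via binary search + hashing) rather than scan the whole range.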
•  » » 11 months ago, # ^ |   0

In your code, why are you also checking the result of the DSU merge?

while (l < r && dsu.merge(l, r - 1, tr))

Once we get the length from which we can start merging, i.e. L + len and R - len are in the same component, don't we just merge until they meet?
•  » » » 11 months ago, # ^ | ← Rev. 2 →   0

It could be the case that there are multiple isolated merges separated by intervals that don't get merged. We have to repeat these two steps until the whole of [L, R] has been processed: (1) binary search to find the maximal interval that doesn't get merged, then (2) merge the relevant elements.