Sanjit-OP's blog

By Sanjit-OP, history, 12 months ago, In English

Hi, Codeforces. This problem has been haunting me for a long time now.

Would you please help me to understand what data structure and approach are needed to solve this problem? This problem is not from any live contest.

Given an array A of N integers in the range [-1e8, 1e8] and Q queries, each with 3 integers [L, R, K]. For each query, the task is to return the sum of the K smallest elements in the subarray A[L...R], where K is in [1, R-L+1].

My approach:

  1. Build a merge-sort segment tree and a prefix-sum segment tree.

  2. For each query do a binary search over an integer (say V).

  3. Now count the elements in the merge-sort seg tree with values <= V and also return the prefix sum of such elements.

  4. Step 3 yields a pair {prefix_sum, C}, where prefix_sum is the sum of the chosen elements and C is their count for the binary-search value V.

  5. If C < K, search for a higher value V in the binary search. Otherwise, the answer is prefix_sum - (C-K)*V: we subtract the (C-K) overcounted elements equal to V, which also handles duplicates during the upper_bound search in each node.

Time complexity: Q.log(X).log(N).log(N), where X is the size of the value range (about 2e8).

  • Q = Total queries

  • log(X) = Binary search over the range of element values.

  • log(N) = Iterating over segment tree nodes

  • log(N) = Upper_bound on prefix sum on each node.
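The approach above could be sketched roughly like this (an untested sketch, not a reference solution; the value bounds come from the statement, everything else is my assumption):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Merge-sort tree: each node keeps the sorted values of its range
// plus prefix sums over that sorted order.
struct MergeSortTree {
    int n;
    vector<vector<long long>> vals;
    vector<vector<long long>> pref;  // pref[v][i] = sum of first i sorted values
    MergeSortTree(const vector<long long>& a)
        : n(a.size()), vals(4 * a.size()), pref(4 * a.size()) {
        build(1, 0, n - 1, a);
    }
    void build(int v, int l, int r, const vector<long long>& a) {
        if (l == r) {
            vals[v] = {a[l]};
        } else {
            int m = (l + r) / 2;
            build(2 * v, l, m, a);
            build(2 * v + 1, m + 1, r, a);
            merge(vals[2 * v].begin(), vals[2 * v].end(),
                  vals[2 * v + 1].begin(), vals[2 * v + 1].end(),
                  back_inserter(vals[v]));
        }
        pref[v].assign(vals[v].size() + 1, 0);
        for (size_t i = 0; i < vals[v].size(); i++)
            pref[v][i + 1] = pref[v][i] + vals[v][i];
    }
    // {count, sum} of elements <= x in a[ql..qr]
    pair<long long, long long> query(int v, int l, int r, int ql, int qr, long long x) {
        if (qr < l || r < ql) return {0, 0};
        if (ql <= l && r <= qr) {
            long long c = upper_bound(vals[v].begin(), vals[v].end(), x) - vals[v].begin();
            return {c, pref[v][c]};
        }
        int m = (l + r) / 2;
        auto L = query(2 * v, l, m, ql, qr, x);
        auto R = query(2 * v + 1, m + 1, r, ql, qr, x);
        return {L.first + R.first, L.second + R.second};
    }
};

// Binary search over the value V: the smallest V with count(<= V) >= k
// pins down the k-th smallest; subtract the overcounted duplicates equal to V.
long long kSmallestSum(MergeSortTree& t, int l, int r, long long k) {
    long long lo = -100000000, hi = 100000000;
    while (lo < hi) {
        long long mid = lo + (hi - lo) / 2;
        if (t.query(1, 0, t.n - 1, l, r, mid).first >= k) hi = mid;
        else lo = mid + 1;
    }
    auto [c, s] = t.query(1, 0, t.n - 1, l, r, lo);
    return s - (c - k) * lo;
}
```

Note that binary searching over the element values (rather than over the answer sum) keeps the outer log factor at about log(2e8).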

Thanks in advance :)


12 months ago, # |

Here's an alternate approach using Mo's algorithm + sqrt decomposition:

1) Coordinate compress the values of array $$$A$$$ so that they lie in the range $$$[0, n-1]$$$

2) Sort the queries based on Mo's comparator.

3) Create 2 sqrt decomposition arrays $$$cnt[blocksize]$$$ and $$$sum[blocksize]$$$, where $$$blocksize = \sqrt{n}$$$. $$$cnt$$$ would store the count of elements that lie within a given block and $$$sum$$$ would store the sum of values (sum of the original values, not the coordinate compressed ones) that lie within a given block.

4) Create an array $$$cnt2[n]$$$, where $$$cnt2_i$$$ = no.of elements with compressed value $$$i$$$ (we will update this array in the add( ) and remove( ) functions of Mo's algorithm).

5) Let's say we want to add an element with compressed value $$$x$$$ and original value $$$y$$$ in Mo's algorithm. We will do the following operations:

add 1 to $$$cnt_{x / blocksize}$$$

add y to $$$sum_{x / blocksize}$$$

add 1 to $$$cnt2_x$$$

Do the opposite in the remove( ) function

6) Now that these 3 arrays reflect the range $$$[l, r]$$$, we are going to answer the current query.

Iterate from $$$0$$$ to $$$n-1$$$, incrementing $$$i$$$ by $$$blocksize$$$. Let the count of all elements in range $$$[0, i - 1] = currcnt$$$

If $$$currcnt + cnt_{i / blocksize} < k$$$, then increment $$$currcnt$$$ by $$$cnt_{i / blocksize}$$$ and add $$$sum_{i / blocksize}$$$ to the answer.

If $$$currcnt + cnt_{i / blocksize} \geq k$$$, then the greatest element that we consider in our answer should lie within this block. Iterate from $$$i$$$ to $$$n-1$$$ until $$$currcnt \geq k$$$ and keep adding the elements that you consider to the answer.

Time complexity: $$$O((n + q)\sqrt{n})$$$
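The data structures described in steps 3–6 could be sketched like this (an untested sketch; `add`/`remove` are the usual Mo transition functions, and a full solution would still sort the queries by the Mo comparator and move the two window pointers):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Sqrt decomposition over compressed values, driven by Mo's add/remove.
struct MoKSmallest {
    int bs;                          // value-block size, ~sqrt(#distinct values)
    vector<long long> orig;          // compressed value -> original value
    vector<int> comp;                // comp[i] = compressed value of a[i]
    vector<long long> blockSum;      // sum of original values per value block
    vector<int> blockCnt;            // count of elements per value block
    vector<int> cnt2;                // cnt2[v] = multiplicity of compressed value v

    MoKSmallest(const vector<long long>& a) {
        orig = a;
        sort(orig.begin(), orig.end());
        orig.erase(unique(orig.begin(), orig.end()), orig.end());
        comp.resize(a.size());
        for (size_t i = 0; i < a.size(); i++)
            comp[i] = lower_bound(orig.begin(), orig.end(), a[i]) - orig.begin();
        bs = max(1, (int)sqrt((double)orig.size()));
        int nb = (orig.size() + bs - 1) / bs;
        blockSum.assign(nb, 0);
        blockCnt.assign(nb, 0);
        cnt2.assign(orig.size(), 0);
    }
    void add(int i) {                // element at array index i enters the window
        int x = comp[i];
        blockCnt[x / bs]++; blockSum[x / bs] += orig[x]; cnt2[x]++;
    }
    void remove(int i) {             // element at array index i leaves the window
        int x = comp[i];
        blockCnt[x / bs]--; blockSum[x / bs] -= orig[x]; cnt2[x]--;
    }
    // sum of the k smallest elements currently in the window
    long long answer(int k) {
        long long res = 0;
        int cur = 0;
        for (int b = 0; b * bs < (int)orig.size(); b++) {
            if (cur + blockCnt[b] < k) {       // take the whole block
                cur += blockCnt[b]; res += blockSum[b];
            } else {                           // k-th smallest lies in this block
                for (int v = b * bs; v < (int)orig.size() && cur < k; v++) {
                    int take = min(cnt2[v], k - cur);
                    cur += take; res += (long long)take * orig[v];
                }
                break;
            }
        }
        return res;
    }
}; 
```

Each `answer` call scans at most one block element-by-element, which is where the extra sqrt(n) per query comes from.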

12 months ago, # |

You can solve this using a modified Fenwick tree.
This is a rough sketch; I will post complete working code later, but you can get the idea from it. It is the general Fenwick tree technique for finding top-K sums.

Basically algorithm would be

for i = n to 1: 
     solve queries whose l = i 

The main issue: a plain Fenwick tree can find the top-k sums, but only for a suffix range [i, n). We need them for [i, j].

So I modified the Fenwick tree to store prefix sums instead of point sums. Where we would normally do bit[i] += x,

we instead do bit[i].push_back(i, x).

Since we push the indices i in decreasing order, we can binary search within a node to restrict the sum to positions r <= j.

So a Fenwick update costs O(log n) and a query costs O((log n)^2). Also, since only the relative order of values matters when finding the top k, I coordinate-compressed them.

Net complexity: O(Q * (log n)^2 + n log n)
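If I understand the idea correctly, it could look like this (my hypothetical sketch, not the commenter's code: queries are grouped by l and answered while l sweeps from n-1 down to 0, so positions enter every node in decreasing order and a binary search restricts each node to positions <= r):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Fenwick tree over compressed values; each node stores the positions pushed
// into it (in decreasing order) plus running sums of the pushed values.
struct OfflineKSmallest {
    int m;                                  // number of distinct values
    vector<long long> vals;                 // compressed value -> original value
    vector<vector<int>> pos;                // positions pushed into each node
    vector<vector<long long>> psum;         // psum[node][i] = sum of first i pushes

    OfflineKSmallest(const vector<long long>& sortedDistinct)
        : m(sortedDistinct.size()), vals(sortedDistinct),
          pos(m + 1), psum(m + 1, vector<long long>(1, 0)) {}

    // add element with compressed value v at position p (p must decrease over calls)
    void add(int v, int p) {
        for (int i = v + 1; i <= m; i += i & -i) {
            pos[i].push_back(p);
            psum[i].push_back(psum[i].back() + vals[v]);
        }
    }
    // {count, sum} of this node's elements whose position is <= r
    pair<long long, long long> nodeQuery(int i, int r) const {
        // positions are strictly decreasing, so entries with pos <= r are a suffix
        int lb = lower_bound(pos[i].begin(), pos[i].end(), r,
                             greater<int>()) - pos[i].begin();
        long long c = (long long)pos[i].size() - lb;
        return {c, psum[i].back() - psum[i][lb]};
    }
    // sum of the k smallest among added elements with position <= r
    long long query(int r, long long k) const {
        long long res = 0;
        int idx = 0;                        // standard Fenwick k-th order descent
        for (int step = 1 << (31 - __builtin_clz(max(m, 1))); step; step >>= 1)
            if (idx + step <= m) {
                auto [c, s] = nodeQuery(idx + step, r);
                if (c < k) { k -= c; res += s; idx += step; }
            }
        return res + k * vals[idx];         // remaining k copies of the k-th value
    }
};
```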

12 months ago, # |

Persistent segment tree: answers queries online in O(log(max value)) time each, with the tree taking O(n log(max value)) space.

resource to learn technique:
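One common way to implement this (my sketch, under the usual assumptions: one persistent version per prefix over coordinate-compressed values, a count and a sum in every node, and a single descent on the difference of two versions per query):

```cpp
#include <bits/stdc++.h>
using namespace std;

struct PersistentKSmallest {
    struct Node { int l = 0, r = 0; long long cnt = 0, sum = 0; };
    int m;                      // number of distinct values
    vector<long long> vals;     // compressed value -> original value
    vector<Node> t{1};          // node 0 = the empty tree
    vector<int> root{0};        // root[i] = version after inserting a[0..i-1]

    PersistentKSmallest(const vector<long long>& a) {
        vals = a;
        sort(vals.begin(), vals.end());
        vals.erase(unique(vals.begin(), vals.end()), vals.end());
        m = vals.size();
        for (long long x : a) {
            int v = lower_bound(vals.begin(), vals.end(), x) - vals.begin();
            root.push_back(insert(root.back(), 0, m - 1, v));
        }
    }
    // path-copying insert of compressed value v; returns the new root
    int insert(int prev, int lo, int hi, int v) {
        int cur = t.size();
        t.push_back(t[prev]);
        t[cur].cnt++; t[cur].sum += vals[v];
        if (lo < hi) {
            int mid = (lo + hi) / 2;
            if (v <= mid) { int c = insert(t[cur].l, lo, mid, v);     t[cur].l = c; }
            else          { int c = insert(t[cur].r, mid + 1, hi, v); t[cur].r = c; }
        }
        return cur;
    }
    // sum of the k smallest in a[l..r], 0-indexed inclusive
    long long query(int l, int r, long long k) {
        return walk(root[l], root[r + 1], 0, m - 1, k);
    }
    long long walk(int a, int b, int lo, int hi, long long k) {
        if (k <= 0) return 0;
        if (lo == hi) return k * vals[lo];   // k copies of this value remain
        int mid = (lo + hi) / 2;
        long long leftCnt = t[t[b].l].cnt - t[t[a].l].cnt;
        if (k <= leftCnt) return walk(t[a].l, t[b].l, lo, mid, k);
        long long leftSum = t[t[b].l].sum - t[t[a].l].sum;
        return leftSum + walk(t[a].r, t[b].r, mid + 1, hi, k - leftCnt);
    }
};
```

With coordinate compression the descent is O(log n) per query; without it, build the tree over the raw value range for the O(log(max value)) bound.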

12 months ago, # |

You can do it online in O(log A) per query with wavelet trees, where A is the largest integer in the array.
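For reference, a compact (and hypothetical, untested) wavelet tree sketch of this: each level stores prefix counts and prefix sums of the elements routed to the left child, so one root-to-leaf walk extracts the sum of the k smallest in a range.

```cpp
#include <bits/stdc++.h>
using namespace std;

struct WaveletTree {
    int lo, hi;                       // value range [lo, hi] of this node
    unique_ptr<WaveletTree> l, r;
    vector<long long> cntL;           // cntL[i] = how many of first i elems go left
    vector<long long> sumL;           // sumL[i] = sum of those elements' originals
    const vector<long long>* vals;    // compressed value -> original value

    WaveletTree(vector<int> a, int lo_, int hi_, const vector<long long>* vs)
        : lo(lo_), hi(hi_), vals(vs) {
        if (lo == hi || a.empty()) return;
        int mid = (lo + hi) / 2;
        cntL = {0}; sumL = {0};
        vector<int> la, ra;
        for (int v : a) {
            bool goesLeft = v <= mid;
            cntL.push_back(cntL.back() + (goesLeft ? 1 : 0));
            sumL.push_back(sumL.back() + (goesLeft ? (*vs)[v] : 0));
            (goesLeft ? la : ra).push_back(v);
        }
        l = make_unique<WaveletTree>(move(la), lo, mid, vs);
        r = make_unique<WaveletTree>(move(ra), mid + 1, hi, vs);
    }
    // sum of the k smallest among positions [ql, qr) of this node's sequence
    long long kSum(int ql, int qr, long long k) {
        if (k <= 0) return 0;
        if (lo == hi) return k * (*vals)[lo];   // k copies of this value
        long long cl = cntL[qr] - cntL[ql];
        if (k <= cl) return l->kSum(cntL[ql], cntL[qr], k);
        return (sumL[qr] - sumL[ql]) +
               r->kSum(ql - cntL[ql], qr - cntL[qr], k - cl);
    }
};

struct WaveletKSmallest {
    vector<long long> vals;
    unique_ptr<WaveletTree> root;
    WaveletKSmallest(const vector<long long>& a) {
        vals = a;
        sort(vals.begin(), vals.end());
        vals.erase(unique(vals.begin(), vals.end()), vals.end());
        vector<int> comp;
        for (long long x : a)
            comp.push_back(lower_bound(vals.begin(), vals.end(), x) - vals.begin());
        root = make_unique<WaveletTree>(move(comp), 0, (int)vals.size() - 1, &vals);
    }
    long long query(int l, int r, long long k) { return root->kSum(l, r + 1, k); }
};
```

Coordinate compression as above gives O(log n) per query; built directly over the raw values it matches the O(log A) bound mentioned in the comment.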