Sanjit-OP's blog

By Sanjit-OP, history, 23 months ago, In English

Hi Codeforces, this problem has been haunting me for a long time now.

Could you please help me understand what data structure and approach are needed to solve it? The problem is not from any live contest.

Given an array A of N integers in the range [-1e8, 1e8] and Q queries, each with 3 integers [L, R, K]. For each query, the task is to return the sum of the K smallest elements in the subarray A[L...R], where 1 <= K <= R-L+1.

My approach:

  1. Build a merge-sort segment tree and a prefix-sum segment tree.

  2. For each query do a binary search over an integer (say V).

  3. Now count the elements of A[L...R] in the merge-sort seg tree with values <= V, and also compute the sum of those elements from the stored prefix sums.

  4. Step 3 returns a pair {prefix_sum, C}, where prefix_sum denotes the sum of the chosen elements and C denotes their count for the binary-search value V.

  5. If C < K, then search for a higher value V in the binary search. Otherwise, the sum of the K smallest elements is prefix_sum - (C-K)*V (this also handles duplicates during the upper_bound search in each node). A rough sketch of this approach is given after the complexity breakdown below.

Time complexity: O(Q · log(X) · log(N) · log(N)), where X is 1e15.

  • Q = Total queries

  • log(X) = Binary search over max answer limit.

  • log(N) = Iterating over segment tree nodes

  • log(N) = upper_bound on each node's sorted list (with a prefix-sum lookup).
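For concreteness, here is a minimal sketch of the approach above (assuming a single merge-sort tree whose nodes also store prefix sums of their sorted lists; names are illustrative, not a finished implementation):

#include <bits/stdc++.h>
using namespace std;
typedef long long ll;

struct MergeSortTree {
    int n;
    vector<vector<ll>> sorted, pref;   // per node: sorted values and their prefix sums

    MergeSortTree(const vector<ll>& a) : n((int)a.size()), sorted(4 * n), pref(4 * n) {
        build(1, 0, n - 1, a);
    }
    void build(int v, int tl, int tr, const vector<ll>& a) {
        if (tl == tr) sorted[v] = {a[tl]};
        else {
            int tm = (tl + tr) / 2;
            build(2 * v, tl, tm, a);
            build(2 * v + 1, tm + 1, tr, a);
            merge(sorted[2 * v].begin(), sorted[2 * v].end(),
                  sorted[2 * v + 1].begin(), sorted[2 * v + 1].end(),
                  back_inserter(sorted[v]));
        }
        pref[v].assign(sorted[v].size() + 1, 0);
        for (size_t i = 0; i < sorted[v].size(); i++)
            pref[v][i + 1] = pref[v][i] + sorted[v][i];
    }
    // {count, sum} of elements <= val inside A[l..r]
    pair<ll, ll> countSum(int v, int tl, int tr, int l, int r, ll val) {
        if (r < tl || tr < l) return {0, 0};
        if (l <= tl && tr <= r) {
            ll c = upper_bound(sorted[v].begin(), sorted[v].end(), val) - sorted[v].begin();
            return {c, pref[v][c]};
        }
        int tm = (tl + tr) / 2;
        auto [c1, s1] = countSum(2 * v, tl, tm, l, r, val);
        auto [c2, s2] = countSum(2 * v + 1, tm + 1, tr, l, r, val);
        return {c1 + c2, s1 + s2};
    }
    // sum of the K smallest elements in A[l..r] (0-indexed, inclusive)
    ll kSmallestSum(int l, int r, ll k) {
        ll lo = -100000000, hi = 100000000;        // value range from the statement
        while (lo < hi) {                          // smallest V with count(<= V) >= k
            ll mid = lo + (hi - lo) / 2;
            if (countSum(1, 0, n - 1, l, r, mid).first >= k) hi = mid;
            else lo = mid + 1;
        }
        auto [c, s] = countSum(1, 0, n - 1, l, r, lo);
        return s - (c - k) * lo;                   // drop the surplus copies of V = lo
    }
};

Here the binary search converges to the smallest V such that C = count(<= V) >= K, and the surplus (C - K) copies of V are subtracted at the end.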

Thanks in advance :)



»
23 months ago, # |

Here's an alternate approach using Mo's algorithm + sqrt decomposition:

1) Coordinate compress the values of array $$$A$$$ so that they lie in the range $$$[0, n-1]$$$

2) Sort the queries based on Mo's comparator.

3) Create 2 sqrt decomposition arrays $$$cnt[blocksize]$$$ and $$$sum[blocksize]$$$, where $$$blocksize = \sqrt{n}$$$. $$$cnt$$$ would store the count of elements that lie within a given block and $$$sum$$$ would store the sum of values (sum of the original values, not the coordinate compressed ones) that lie within a given block.

4) Create an array $$$cnt2[n]$$$, where $$$cnt2_i$$$ = the number of elements with compressed value $$$i$$$ (we will update this array in the add( ) and remove( ) functions of Mo's algorithm).

5) Let's say we want to add an element with compressed value $$$x$$$ and original value $$$y$$$ in Mo's algorithm. We will do the following operations:

add 1 to $$$cnt_{x / blocksize}$$$

add $$$y$$$ to $$$sum_{x / blocksize}$$$

add 1 to $$$cnt2_x$$$

Do the opposite in the remove( ) function

6) Now that these 3 arrays reflect the range $$$[l, r]$$$, we are going to answer the current query.

Iterate from $$$0$$$ to $$$n-1$$$, incrementing $$$i$$$ by $$$blocksize$$$. Let $$$currcnt$$$ denote the count of all elements with compressed value in $$$[0, i - 1]$$$.

If $$$currcnt + cnt_{i / blocksize} < k$$$, then increment $$$currcnt$$$ by $$$cnt_{i / blocksize}$$$ and add $$$sum_{i / blocksize}$$$ to the answer.

If $$$currcnt + cnt_{i / blocksize} \geq k$$$, then the greatest element that we consider in our answer lies within this block. Iterate over compressed values from $$$i$$$ to $$$n-1$$$ using $$$cnt2$$$, adding elements to the answer until $$$currcnt \geq k$$$ (taking only as many copies of the last value as needed); a sketch of this is given below the complexity estimate.

Time complexity: $$$O((n + q)\sqrt{n})$$$
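A rough, self-contained sketch of the add/remove functions and the per-query walk described above (it assumes coordinate compression is already done; all names are illustrative, and $$$m$$$ here is the number of distinct compressed values):

#include <bits/stdc++.h>
using namespace std;
typedef long long ll;

int m, blocksize;
vector<int> comp;                 // compressed value per array position
vector<ll> orig;                  // original value per compressed value
vector<ll> blockCnt, blockSum;    // per value-block: count and sum of original values
vector<int> cnt2;                 // multiplicity of each compressed value in the window

void init(const vector<int>& compressed, const vector<ll>& originals) {
    comp = compressed;
    orig = originals;
    m = (int)orig.size();
    blocksize = max(1, (int)sqrt((double)m));
    blockCnt.assign(m / blocksize + 1, 0);
    blockSum.assign(m / blocksize + 1, 0);
    cnt2.assign(m, 0);
}
void add(int pos) {
    int c = comp[pos];
    blockCnt[c / blocksize]++;
    blockSum[c / blocksize] += orig[c];
    cnt2[c]++;
}
void remove_(int pos) {               // mirror of add()
    int c = comp[pos];
    blockCnt[c / blocksize]--;
    blockSum[c / blocksize] -= orig[c];
    cnt2[c]--;
}
// sum of the k smallest elements currently inside the Mo window
ll answer(ll k) {
    ll currcnt = 0, res = 0;
    for (int i = 0; i < m; i += blocksize) {
        int b = i / blocksize;
        if (currcnt + blockCnt[b] < k) {            // take the whole block
            currcnt += blockCnt[b];
            res += blockSum[b];
        } else {                                    // the k-th smallest is in this block
            for (int c = i; c < m && currcnt < k; c++) {
                ll take = min<ll>(cnt2[c], k - currcnt);   // partial copies of the last value
                currcnt += take;
                res += take * orig[c];
            }
            break;
        }
    }
    return res;
}

Each call to answer( ) scans at most all blocks plus the contents of one block, i.e. $$$O(\sqrt{n})$$$, which keeps the overall bound above.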

»
23 months ago, # |

You can solve it using a modified Fenwick tree:
https://ideone.com/GjlHbl
This is rough code; I will post complete working code later, but you can definitely understand the idea from it.


https://ideone.com/5x16ud: this is general Fenwick tree code to find top-K sums.

Basically, the algorithm would be:

for i = n down to 1:
    add_fenwick(a_i)
    solve the queries whose l = i

The main issue here is that with a Fenwick tree we can find the top k sums, but only for the range [i, n); we need them for [i, j].

So I modified the Fenwick tree from storing point sums to storing prefix sums: where we would normally perform bit[idx] += x, we instead do bit[idx].push_back(i, x). Since we push the positions i in decreasing order, we can binary search inside a node to get the sum only for positions r <= j.

So the Fenwick tree update complexity would be O(log n) and the query complexity would be O((log n)^2). Also, since we only care about the relative order of values while finding the top k, I compressed the values beforehand, like in top-K-frequency problems.

Net complexity: O(Q * (log n)^2 + n log n). A self-contained sketch of this idea is given below.
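Here is a rough sketch of that idea (my own illustration, not the ideone code above): the BIT is indexed by compressed value, each node keeps the positions that updated it (pushed in decreasing order) together with running sums, and the usual BIT descent for the k-th order statistic is restricted to positions <= r by a binary search inside each visited node.

#include <bits/stdc++.h>
using namespace std;
typedef long long ll;

struct OfflineKSmallestSum {
    int m, LOG;                        // m = number of distinct values
    vector<ll> vals;                   // sorted distinct original values
    vector<vector<int>> pos;           // pos[v]  : positions that updated node v (decreasing)
    vector<vector<ll>> pref;           // pref[v] : running sums of those updates (pref[v][0] = 0)

    OfflineKSmallestSum(const vector<ll>& sortedDistinct)
        : m((int)sortedDistinct.size()), vals(sortedDistinct),
          pos(m + 1), pref(m + 1, vector<ll>(1, 0)) {
        LOG = 1;
        while ((1 << LOG) <= m) LOG++;
    }
    // call with i = n, n-1, ..., 1 (positions in decreasing order)
    void add(int i, ll value) {
        int v = (int)(lower_bound(vals.begin(), vals.end(), value) - vals.begin()) + 1;
        for (; v <= m; v += v & -v) {
            pos[v].push_back(i);
            pref[v].push_back(pref[v].back() + value);
        }
    }
    // {count, sum} of the updates stored in node v whose position is <= r
    pair<ll, ll> nodeUpTo(int v, int r) const {
        // positions are stored in decreasing order, so "position <= r" is a suffix
        int idx = (int)(lower_bound(pos[v].begin(), pos[v].end(), r, greater<int>()) - pos[v].begin());
        return {(ll)pos[v].size() - idx, pref[v].back() - pref[v][idx]};
    }
    // sum of the k smallest among the already-added positions that are <= r
    ll query(int r, ll k) const {
        int node = 0;
        ll res = 0, need = k;
        for (int pw = LOG; pw >= 0; pw--) {
            int nxt = node + (1 << pw);
            if (nxt > m) continue;
            auto [c, s] = nodeUpTo(nxt, r);
            if (c < need) {            // every value covered by this node is among the k smallest
                need -= c;
                res += s;
                node = nxt;
            }
        }
        // the k-th smallest is the compressed value node+1; take the remaining copies of it
        return res + need * vals[node];
    }
};

Usage: process i = n down to 1, call add(i, a[i]), then answer every stored query with l == i via query(r, k).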

»
23 months ago, # |

Persistent seg tree, online queries in O(log(max value)) time & space: https://ideone.com/yjwDSV

Resource to learn the technique: https://cp-algorithms.com/data_structures/segment_tree.html#finding-the-k-th-smallest-number-in-a-range (a minimal sketch extended with sums is given below).
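For reference, a minimal sketch of this technique extended with sums (my own illustration, not the linked code): each node of the persistent tree stores a count and a sum over a range of compressed values, and version r minus version l-1 describes exactly the subarray A[l..r].

#include <bits/stdc++.h>
using namespace std;
typedef long long ll;

struct PersistentSegTree {
    struct Node { int l = 0, r = 0, cnt = 0; ll sum = 0; };
    int m;                         // number of distinct values
    vector<ll> vals;               // sorted distinct values
    vector<Node> t;                // node pool, node 0 is the empty tree
    vector<int> root;              // root[i] = version after inserting a[1..i]

    PersistentSegTree(const vector<ll>& a) {
        vals = a;
        sort(vals.begin(), vals.end());
        vals.erase(unique(vals.begin(), vals.end()), vals.end());
        m = (int)vals.size();
        t.reserve((a.size() + 1) * 32);
        t.push_back(Node());                       // the empty version
        root.push_back(0);
        for (ll x : a) {
            int c = (int)(lower_bound(vals.begin(), vals.end(), x) - vals.begin());
            root.push_back(insert(root.back(), 0, m - 1, c, x));
        }
    }
    int insert(int prev, int lo, int hi, int c, ll x) {
        t.push_back(t[prev]);                      // copy the old node
        int cur = (int)t.size() - 1;
        t[cur].cnt += 1;
        t[cur].sum += x;
        if (lo < hi) {
            int mid = (lo + hi) / 2;
            if (c <= mid) { int child = insert(t[prev].l, lo, mid, c, x);     t[cur].l = child; }
            else          { int child = insert(t[prev].r, mid + 1, hi, c, x); t[cur].r = child; }
        }
        return cur;
    }
    // sum of the k smallest values in a[l..r], 1-indexed, fully online
    ll kSmallestSum(int l, int r, ll k) { return walk(root[l - 1], root[r], 0, m - 1, k); }
    ll walk(int a, int b, int lo, int hi, ll k) {
        if (lo == hi) return k * vals[lo];                 // take k copies of this value
        int mid = (lo + hi) / 2;
        ll leftCnt = t[t[b].l].cnt - t[t[a].l].cnt;        // elements of a[l..r] in the left half
        if (k <= leftCnt) return walk(t[a].l, t[b].l, lo, mid, k);
        ll leftSum = t[t[b].l].sum - t[t[a].l].sum;
        return leftSum + walk(t[a].r, t[b].r, mid + 1, hi, k - leftCnt);
    }
};

Each query walks a single root-to-leaf path, so it is O(log n) after the O(n log n) build, and everything works online.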

»
23 months ago, # |

You can do it online in log(A) per query with wavelet trees, where A is the largest integer in the array.

code
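A minimal wavelet-tree sketch that supports this query (my own illustration, not the code in the spoiler above; all names are assumptions):

#include <bits/stdc++.h>
using namespace std;
typedef long long ll;

struct WaveletTree {
    ll lo, hi;                        // value range handled by this node
    WaveletTree *left = nullptr, *right = nullptr;
    vector<int> cntL;                 // cntL[i]: how many of the first i elements go left
    vector<ll> sumL;                  // sumL[i]: sum of those left-going elements

    // builds on values in [from, to); reorders that range, so pass a copy of the array
    WaveletTree(vector<ll>::iterator from, vector<ll>::iterator to, ll lo, ll hi)
        : lo(lo), hi(hi) {
        if (from == to || lo == hi) return;
        ll mid = lo + (hi - lo) / 2;
        auto goesLeft = [mid](ll v) { return v <= mid; };
        cntL.push_back(0);
        sumL.push_back(0);
        for (auto it = from; it != to; ++it) {
            cntL.push_back(cntL.back() + (goesLeft(*it) ? 1 : 0));
            sumL.push_back(sumL.back() + (goesLeft(*it) ? *it : 0));
        }
        auto pivot = stable_partition(from, to, goesLeft);
        left = new WaveletTree(from, pivot, lo, mid);
        right = new WaveletTree(pivot, to, mid + 1, hi);
    }
    // sum of the k smallest values among positions [a, b] of this node (1-indexed, k >= 1)
    ll kSmallestSum(int a, int b, ll k) {
        if (lo == hi) return k * lo;                       // all remaining values equal lo
        ll inLeft = cntL[b] - cntL[a - 1];
        if (k <= inLeft)
            return left->kSmallestSum(cntL[a - 1] + 1, cntL[b], k);
        ll leftSum = sumL[b] - sumL[a - 1];                // every left value <= every right value
        return leftSum + right->kSmallestSum(a - cntL[a - 1], b - cntL[b], k - inLeft);
    }
};

// usage sketch (l, r are 1-indexed, k >= 1):
// vector<ll> tmp = a;                       // the constructor reorders the range it is given
// WaveletTree wt(tmp.begin(), tmp.end(),
//                *min_element(a.begin(), a.end()), *max_element(a.begin(), a.end()));
// ll ans = wt.kSmallestSum(l, r, k);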

»
3 months ago, # |

You can get O((n+q) log n) with a merge sort tree and fractional cascading after doing value compression. Another solution is to build a persistent segment tree or a wavelet tree.

»
3 months ago, # |

A chairman tree (persistent segment tree) can solve this problem online in $$$O(n \log n)$$$, but with $$$O(n \log n)$$$ memory.