beginner1010's blog

By beginner1010, 4 weeks ago, In English

Today, I tried F2. Same Sum Blocks (Hard) from Codeforces Round #547 (Div. 3). Since $$$n$$$ can be as large as $$$1500$$$, I struggled to come up with an $$$O(n^2)$$$ solution, but I failed.

When I read the editorial, I was surprised to find that the writer's solution is $$$O(n^3)$$$. They used a Hashmap to speed up the whole process, and it seems that the number of distinct sums of consecutive elements is less than $$$n^2$$$. Although I cannot find an example where the number of distinct sub-array sums is as large as $$$n^2$$$ (or even as large as $$$\frac{n \times (n - 1)}{2}$$$), I still cannot convince myself that an $$$O(n^3)$$$ approach can pass the system tests.

I was wondering if you have any analysis demonstrating that the optimization (using a Hashmap) is sufficient, or if you know any bound on the number of distinct sums of consecutive elements in an arbitrary array.

 
 
 
 


»
4 weeks ago, # |

There are only $$$\frac{n\cdot(n+1)}{2}$$$ non-empty subarrays (they can be described by $$$0\leq l < r \leq n$$$, where the subarray is $$$[l, r)$$$). So there are obviously no more than $$$n^2$$$ different sums.

And there can be up to $$$\frac{n\cdot(n+1)}{2}$$$ different sums: the array $$$1,2,4,8,\dots,2^{n-1}$$$ has this property.

  • »
    »
    4 weeks ago, # ^ |

    Yes. Definitely, there cannot be more than $$$\frac{n (n + 1)}{2}$$$. $$$1, 2, 4, 8, \dots, 2^{n-1}$$$ is a nice example, although for that array the input cannot have $$$n$$$ larger than about $$$63$$$, otherwise the sums overflow 64-bit integers.

    Surprisingly, while implementing the $$$O(n^3)$$$ solution, I came up with an $$$O(n^2)$$$ one. Here is the idea: for each $$$sum$$$, we keep track of the right bound of the most recently chosen range.

    To be more specific, for each $$$sum$$$, we have two options:

    1. Add the current interval $$$[l,r]$$$ if it does not intersect with the previous range (the most recent one).

    2. If the current range intersects with the previous one, keep the range whose right bound $$$r$$$ is lower.

    It can be seen that this greedy approach is optimal. Hence, we get $$$O(1)$$$ update time per segment (assuming the Hashmap does each update in amortized $$$O(1)$$$).

    My submissions: 59498958 ($$$O(n^2)$$$ time, $$$O(n^3)$$$ space) and 59499181 ($$$O(n^2)$$$ time, $$$O(n^2)$$$ space).

    • »
      »
      »
      4 weeks ago, # ^ |

      Space complexity can't be higher than time complexity.

      • »
        »
        »
        »
        4 weeks ago, # ^ |

        A simple stack allocation takes $$$O(1)$$$ time regardless of the amount of memory, so actually it can.

»
4 weeks ago, # |

Here is a way to get $$$\Theta(n^2)$$$ distinct sums: $$$a_i = 1$$$ if $$$i < n/2$$$, and $$$a_i = n/2+1$$$ otherwise.

»
4 weeks ago, # |

Why do you think the solution from the editorial is $$$O(n^3)$$$?

It looks like you believe that iterating over the possible sums is $$$O(n^2)$$$ and iterating over the segments of each sum is $$$O(n)$$$, so the total is $$$O(n^3)$$$. But it is $$$O(n^2)$$$: every segment is looked at only once, and there are only $$$O(n^2)$$$ of them.