beginner1010's blog

By beginner1010, history, 5 years ago, In English

Today, I tried F2. Same Sum Blocks (Hard) from Codeforces Round #547 (Div. 3). Since $$$n$$$ can be as large as $$$1500$$$, I struggled to come up with an $$$O(n^2)$$$ solution, but I failed.

When I read the editorial, I was surprised to find that the writer's solution is $$$O(n^3)$$$. They use a hash map to speed up the whole process, apparently relying on the number of distinct sums of consecutive elements being less than $$$n^2$$$. Although I cannot find an example where the number of distinct subarray sums reaches $$$n^2$$$ (or at least $$$\frac{n \times (n - 1)}{2}$$$), I still cannot convince myself that an $$$O(n^3)$$$ approach can pass the system tests.

I was wondering if you have any analysis showing that the optimization (using a hash map) is sufficient, or if you know any bound on the number of distinct sums of consecutive elements in an arbitrary array.



»
5 years ago, # |

There are only $$$\frac{n \cdot (n+1)}{2}$$$ non-empty subarrays (each can be described by $$$0 \leq l < r \leq n$$$, where the subarray covers $$$[l, r)$$$). So there are obviously no more than $$$n^2$$$ different sums.

And that bound is attainable: the array $$$1, 2, 4, 8, \dots, 2^{n-1}$$$ has $$$\frac{n \cdot (n+1)}{2}$$$ different subarray sums.
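To spell out why (a short derivation added here for completeness): the sum of the subarray $$$[l, r)$$$ of that array is $$$\sum_{i=l}^{r-1} 2^i = 2^r - 2^l$$$, which in binary is a contiguous block of ones occupying bits $$$l$$$ through $$$r-1$$$. Distinct pairs $$$(l, r)$$$ therefore yield distinct sums.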

  • »
    »
    5 years ago, # ^ |

    Yes, definitely there cannot be more than $$$\frac{n (n + 1)}{2}$$$ distinct sums, and $$$1, 2, 4, 8, \dots, 2^{n-1}$$$ is a nice extremal example. However, such an input cannot have $$$n$$$ larger than about $$$63$$$; otherwise the values overflow 64-bit integers.

    Surprisingly, while implementing the $$$O(n^3)$$$ solution, I came up with an $$$O(n^2)$$$ one. Here is the idea: for each $$$sum$$$, we only need to keep the right bound of the most recently chosen range.

    To be more specific, for each $$$sum$$$ we have two options:

    1. Add the current range $$$[l, r]$$$ if it does not intersect the previously chosen range (the most recent one).

    2. If the current range intersects the previous one, keep the range whose right bound $$$r$$$ is smaller.

    It can be seen that this greedy approach is optimal: of two overlapping candidate ranges, the one with the smaller right endpoint leaves more room for future non-intersecting ranges (the classic interval-scheduling argument). Hence, we get $$$O(1)$$$ update time for each $$$sum$$$ (assuming the hash map update is amortized $$$O(1)$$$); a code sketch is given below.

    My submissions: 59498958 ($$$O(n^2)$$$ time, $$$O(n^3)$$$ space) and 59499181 ($$$O(n^2)$$$ time, $$$O(n^2)$$$ space).
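    A minimal C++ sketch of this greedy, written for illustration rather than copied from the submissions above (it assumes the problem's input format, $$$n$$$ followed by the array, and prints 1-indexed blocks):

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;
    cin >> n;
    vector<long long> pref(n + 1, 0);
    for (int i = 1; i <= n; i++) {
        long long x;
        cin >> x;
        pref[i] = pref[i - 1] + x; // prefix sums, so a block sum is O(1)
    }

    // For every sum, remember the right endpoint of the last chosen
    // block and the blocks chosen so far. Enumerating blocks in
    // increasing order of r makes this the classic "earliest finishing
    // time" greedy, so the kept family of blocks is maximal for each sum.
    unordered_map<long long, int> lastEnd;
    unordered_map<long long, vector<pair<int, int>>> blocks;
    for (int r = 1; r <= n; r++) {
        for (int l = r; l >= 1; l--) {
            long long sum = pref[r] - pref[l - 1];
            auto it = lastEnd.find(sum);
            if (it == lastEnd.end() || it->second < l) { // no overlap with the last block
                lastEnd[sum] = r;
                blocks[sum].push_back({l, r});
            }
        }
    }

    // Output the answer for the sum that collected the most blocks.
    long long best = 0;
    size_t bestCnt = 0;
    for (auto& [s, v] : blocks)
        if (v.size() > bestCnt) { bestCnt = v.size(); best = s; }
    cout << bestCnt << "\n";
    for (auto& [l, r] : blocks[best]) cout << l << " " << r << "\n";
}
```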

»
5 years ago, # |

Here is a way to get $$$\Theta(n^2)$$$ distinct sums without huge values: $$$a_i = 1$$$ if $$$i < n/2$$$, and $$$a_i = n/2 + 1$$$ otherwise. A subarray crossing the middle with $$$j$$$ ones and $$$k$$$ large elements sums to $$$j + k \cdot (n/2 + 1)$$$, and since $$$j \leq n/2$$$, distinct pairs $$$(j, k)$$$ give distinct sums.
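A quick brute-force check of this construction (my own illustration; the size $$$n = 1000$$$ is an arbitrary choice):

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n = 1000; // arbitrary even size for the experiment
    vector<long long> a(n);
    for (int i = 0; i < n; i++)
        a[i] = (i < n / 2) ? 1 : n / 2 + 1;

    // Enumerate all subarray sums and count the distinct ones.
    set<long long> sums;
    for (int l = 0; l < n; l++) {
        long long s = 0;
        for (int r = l; r < n; r++) {
            s += a[r];
            sums.insert(s);
        }
    }
    // Prints 251000 for n = 1000, i.e. roughly n^2 / 4 distinct sums.
    cout << sums.size() << " distinct sums for n = " << n << "\n";
}
```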

»
5 years ago, # |

Why do you think the editorial solution is $$$O(n^3)$$$?

It looks like you believe that iterating over the possible sums is $$$O(n^2)$$$ and iterating over the segments for each sum is another $$$O(n)$$$, so the total is $$$O(n^3)$$$. But it is $$$O(n^2)$$$: every segment is looked at exactly once, and there are only $$$O(n^2)$$$ of them.