r3novatio's blog

By r3novatio, 3 years ago, In English
Summary

This blog is meant to discuss two things:

  1. Other ways to solve the mentioned problem (without needing the data structures mentioned in the editorial)
  2. Whether MLE depends on the memory currently in use or on the combined total allocated
Detail

Problems that mandate Sparse Tables or Segment Trees generally don't show up this early, and even when they do, they get far fewer solves if they're based purely on knowledge of Sparse Tables or Segment Trees.

I wonder what ways there could be to solve 1549D - Integers Have Friends without prerequisite knowledge of these advanced data structures. Can anyone propose other approaches they might have tried?

One approach I tried uses divide and conquer (it solves the longest subarray with GCD > 1 problem, which is equivalent):

My Approach

I came up with this implementation:

C++ Code

Unfortunately, the code is throwing an unexpected Memory Limit Exceeded error (124716291). I'm not sure whether it's because the combined memory usage exceeds the permitted limit, or because at some point during execution the code tries to take up more memory than allowed.

I am aware that C++ calls destructors for STL objects when their scope ends, yet I even tried clearing both maps manually every time they were done being used.

Is there any need to free memory explicitly beyond what has been done? I always thought MLEs occur when the instantaneous (and not the accumulated) memory allocation goes beyond the limit at some point.

Edit:

The MLE issue got resolved. I had missed the case where the array size is 0 (which is possible because it's the difference array of the input).


»
3 years ago

peti1234 shared his approach here

  • »
    »
    3 years ago

Somehow peti1234 disappeared along with the whole comment branch. Weird things happen here.

»
3 years ago

For n==1 you subtract 1 from n for the difference array's size, and then pass 0, -1 to calc, which misses your l==r check.

  • »
    »
    3 years ago

    Oh. Yes it passed, thanks.

    What's with the MLE though?

    • »
      »
      »
      3 years ago

Infinite recursion from the second nested call: the -1 that was in r gets reused indefinitely.

»
3 years ago

There's also another technique to solve this problem: we can build a queue that supports push_back, pop_front, and querying the GCD of the whole queue, all in $$$O(\log A)$$$ amortized. This can be done by simulating the queue with two stacks and keeping a running GCD in each stack. With this data structure, a very simple two-pointers pass computes the answer, for a total time complexity of $$$O(n \log A)$$$. A full explanation is here under "Segment with small spread".
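
For concreteness, here is a minimal sketch of that idea, assuming the two-stacks simulation described above (the names GcdQueue and longestFriendly are mine, not from the linked write-up). Each stack entry stores the pushed value together with the GCD of its stack so far, so the GCD of the whole queue is just the GCD of the two stack tops.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Queue supporting push, pop and "GCD of all elements" in O(log A) amortized.
// Simulated with two stacks; each entry keeps {value, gcd of its stack so far}.
struct GcdQueue {
    vector<pair<long long, long long>> in, out;   // in: push side, out: pop side

    void push(long long x) {
        long long g = in.empty() ? x : gcd(in.back().second, x);
        in.push_back({x, g});
    }
    void pop() {
        if (out.empty()) {                        // refill the pop side, recomputing gcds
            while (!in.empty()) {
                long long x = in.back().first;
                in.pop_back();
                long long g = out.empty() ? x : gcd(out.back().second, x);
                out.push_back({x, g});
            }
        }
        out.pop_back();
    }
    long long gcdAll() const {                    // gcd of the whole queue (0 if empty)
        long long a = in.empty() ? 0 : in.back().second;
        long long b = out.empty() ? 0 : out.back().second;
        return gcd(a, b);
    }
    bool empty() const { return in.empty() && out.empty(); }
};

// Two pointers over the difference array d of the input:
// longest window whose gcd is not 1, plus one to count original elements.
int longestFriendly(const vector<long long>& d) {
    GcdQueue q;
    int best = 0, l = 0;
    for (int r = 0; r < (int)d.size(); ++r) {
        q.push(llabs(d[r]));
        while (!q.empty() && q.gcdAll() == 1) { q.pop(); ++l; }
        best = max(best, r - l + 1);
    }
    return best + 1;
}
```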

»
3 years ago

For a fixed right endpoint $$$i$$$, if $$$\gcd(A_j,\dots,A_i)=\gcd(A_k,\dots,A_i)$$$ and $$$j<k$$$, then $$$k$$$ is no longer useful as $$$i$$$ increases. Using that observation, we can maintain the list of at most $$$\log(A)$$$ possible left endpoints and keep updating it.

My code: 124546247 (the sort is unnecessary as the GCD values are already sorted).
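
A minimal sketch of that idea over the difference array (this is not the linked submission; the names are illustrative): for the current right endpoint, keep each distinct running GCD together with the smallest left endpoint attaining it.

```cpp
#include <bits/stdc++.h>
using namespace std;

// cand holds, for the current right endpoint i, every distinct value of
// gcd(d[j..i]) paired with the smallest j attaining it; since each new distinct
// value divides the previous one, |cand| stays <= ~log2(1e18) ~ 60.
int longestFriendly(const vector<long long>& d) {
    vector<pair<long long, int>> cand;            // {gcd of d[j..i], smallest such j}
    int best = 0;
    for (int i = 0; i < (int)d.size(); ++i) {
        vector<pair<long long, int>> nxt = {{llabs(d[i]), i}};
        for (auto [g, j] : cand) {                // cand is ordered by decreasing j
            long long ng = gcd(g, llabs(d[i]));
            if (ng == nxt.back().first) nxt.back().second = j;  // same gcd: keep leftmost j
            else nxt.push_back({ng, j});
        }
        cand = move(nxt);
        for (auto [g, j] : cand)
            if (g != 1) best = max(best, i - j + 1);  // gcd 0 (all-zero diffs) also counts
    }
    return best + 1;                              // k differences -> k + 1 original elements
}
```

Because the running GCDs only shrink (each one divides the previous), equal values are always adjacent in cand, which is why no sorting is needed.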

»
3 years ago

I believe there's a typo in your algorithm definition for the right half. It should be: $$$mapR[\gcd(A_{mid+1},\dots,A_j)]=j$$$.

Legit question: why is it that $$$|map| \le \log(10^{18})$$$?

  • »
    »
    3 years ago

    Ah, subtle point. Thanks!

why is it that $$$|map| \le \log(10^{18})$$$?

$$$|map|$$$ counts the distinct values that $$$\gcd(A_i,\dots,A_{mid})$$$ can take. The very first value inserted is $$$A_{mid}$$$ itself, and every subsequent distinct GCD value has to be at least $$$2$$$ times smaller.
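
To spell out the count with explicit notation (the $$$g_k$$$ below are just labels for those distinct values): if $$$g_1 > g_2 > \dots > g_t$$$ are the distinct values taken by $$$\gcd(A_i,\dots,A_{mid})$$$ as $$$i$$$ moves left from $$$mid$$$, then each $$$g_{k+1}$$$ is a proper divisor of $$$g_k$$$, so $$$g_{k+1} \le g_k/2$$$. Starting from $$$g_1 = A_{mid} \le 10^{18}$$$, this gives $$$t \le 1 + \log_2 10^{18} \approx 61$$$.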

»
3 years ago

You can also modify the $$$O(n^2)$$$ solution by checking whether you have already visited a position with the same running gcd value before. If you have, you can stop that iteration and move on to the next one.

I did that with sets, but I think it can be improved by keeping track of the positions where the gcd changes.
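
A minimal sketch of one way to realize that pruning, assuming the visited states are pairs of (position, running gcd) over the difference array (the names here are illustrative, not the commenter's code):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Pruned O(n^2): extend each left endpoint to the right, but stop as soon as the
// state {position, running gcd} has been seen before -- an earlier (further-left)
// starting point already explored everything reachable from that state.
int longestFriendly(const vector<long long>& d) {
    set<pair<int, long long>> seen;               // visited {index, running gcd} states
    int best = 0;
    for (int l = 0; l < (int)d.size(); ++l) {
        long long g = 0;                          // gcd(0, x) == x starts the running gcd
        for (int r = l; r < (int)d.size(); ++r) {
            g = gcd(g, llabs(d[r]));
            if (g == 1) break;                    // this window can no longer be friendly
            if (!seen.insert({r, g}).second) break;  // repeated state: already handled
            best = max(best, r - l + 1);
        }
    }
    return best + 1;                              // k differences -> k + 1 original elements
}
```

Since each distinct state is inserted at most once and there are only $$$O(n \log A)$$$ of them, the inner loop runs $$$O(n \log A)$$$ times in total.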