RestingRajarshi's blog

By RestingRajarshi, 5 years ago, In English

I was trying out implementing an incremental version of CHT (convex hull trick) with sqrt decomposition, and I noticed a very weird thing. When I made my block size 350, the code TLEd at >2000 ms. However, when I made it 351, it passed in 390 ms.
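For context, a common way to make CHT incremental with sqrt decomposition (a generic sketch, not necessarily my exact implementation) is to buffer new lines and rebuild the full hull every B insertions; a query brute-forces the buffer and binary-searches the hull. Roughly O(q·(B + log n) + (n/B)·n log n) overall, so B is the knob being tuned here:

```cpp
#include <bits/stdc++.h>
using namespace std;

// Hypothetical sketch of incremental CHT (maximum queries) via sqrt
// decomposition: new lines sit in a buffer; once the buffer holds B lines,
// the whole upper hull is rebuilt from scratch.
struct CHT {
    struct Line {
        long long m, c;
        long long at(long long x) const { return m * x + c; }
    };
    vector<Line> hull, buf;  // hull: upper envelope, slopes strictly increasing
    size_t B;
    CHT(size_t blockSize) : B(blockSize) {}

    // Middle line b is useless between a and c (slopes strictly increasing);
    // cross-multiplied to avoid division, __int128 to avoid overflow.
    static bool bad(const Line& a, const Line& b, const Line& c) {
        return (__int128)(c.c - a.c) * (b.m - a.m) >=
               (__int128)(b.c - a.c) * (c.m - a.m);
    }
    void rebuild() {
        for (auto& l : buf) hull.push_back(l);
        buf.clear();
        sort(hull.begin(), hull.end(), [](const Line& a, const Line& b) {
            return a.m < b.m || (a.m == b.m && a.c < b.c);
        });
        vector<Line> u;  // deduplicate equal slopes, keeping the larger intercept
        for (auto& l : hull) {
            if (!u.empty() && u.back().m == l.m) u.back() = l;
            else u.push_back(l);
        }
        vector<Line> h;
        for (auto& l : u) {
            while (h.size() >= 2 && bad(h[h.size() - 2], h.back(), l)) h.pop_back();
            h.push_back(l);
        }
        hull.swap(h);
    }
    void add(long long m, long long c) {
        buf.push_back({m, c});
        if (buf.size() >= B) rebuild();
    }
    long long query(long long x) {
        long long best = LLONG_MIN;
        for (auto& l : buf) best = max(best, l.at(x));  // brute-force the buffer
        if (hull.empty()) return best;
        // at(x) along the hull is unimodal in the index, so binary search works
        int lo = 0, hi = (int)hull.size() - 1;
        while (lo < hi) {
            int mid = (lo + hi) / 2;
            if (hull[mid].at(x) < hull[mid + 1].at(x)) lo = mid + 1;
            else hi = mid;
        }
        return max(best, hull[lo].at(x));
    }
};
```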

Here are two sets of submissions that differ in only one aspect: the block size.

blocksize = 350 : TLE

blocksize = 350 : TLE

blocksize = 351 : AC(389ms)

This seems very weird to me, since block sizes below 350 also give AC (although with more time; block size = 200 gets AC in 1669 ms). However, it's the specific value of 350 that times out.

What could be a possible reason for this? I am just curious. (As far as I can see, my code does not have any undefined behaviour.)


»
5 years ago, # |

I didn't read the code, but if I were the author then I'd possibly make countertests for some popular values of block size, 350 being among them.

  • »
    »
    5 years ago, # ^ |

    The problem isn't meant to be solved by sqrt decomposition. The optimal solution is O(n) CHT (the slopes and query values are already sorted in the problem). I was just trying out my implementation on it, hence I doubt the author made any special case to fail a certain block size.
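    The fully monotone case mentioned above admits a stack-plus-pointer CHT: since lines arrive in increasing-slope order and queries in increasing x, each line and each query is processed in amortized O(1). A minimal sketch (maximum queries, all lines inserted before the queries; names are my own, not from the problem):

    ```cpp
    #include <bits/stdc++.h>
    using namespace std;

    // Monotone CHT for maximum queries: add() requires non-decreasing slopes,
    // query() requires non-decreasing x. No binary search needed.
    struct MonotoneCHT {
        vector<long long> M, C;  // hull, slopes strictly increasing
        size_t ptr = 0;          // only ever moves right, since queries are sorted

        // Middle line b is useless between a and the new line (m, c).
        bool bad(size_t a, size_t b, long long m, long long c) {
            return (__int128)(c - C[a]) * (M[b] - M[a]) >=
                   (__int128)(C[b] - C[a]) * (m - M[a]);
        }
        void add(long long m, long long c) {
            if (!M.empty() && M.back() == m) {      // equal slopes: keep the
                if (C.back() >= c) return;          // higher intercept only
                M.pop_back(); C.pop_back();
            }
            while (M.size() >= 2 && bad(M.size() - 2, M.size() - 1, m, c)) {
                M.pop_back(); C.pop_back();
            }
            M.push_back(m); C.push_back(c);
        }
        long long query(long long x) {
            if (ptr >= M.size()) ptr = M.size() - 1;
            while (ptr + 1 < M.size() &&
                   M[ptr + 1] * x + C[ptr + 1] >= M[ptr] * x + C[ptr]) ptr++;
            return M[ptr] * x + C[ptr];
        }
    };
    ```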

    • »
      »
      »
      5 years ago, # ^ |
      There is no problem with block-size-specific tests; there is just an infinite loop in your code.

      I simply saw that there was a possibility your code could run forever, and checked it.

      53469476