lucyanna2018's blog

By lucyanna2018, history, 4 months ago, In English
 
 
 
 
score +57

»
4 months ago, # |
Rev. 3, score +21

Actually, this method can also compute prefix sums of certain non-multiplicative functions.

For example, write n = p_1 p_2 ⋯ p_m with p_i ≤ p_{i+1} (the prime factorization of n with multiplicity), and let f(n) = p_{m-1}, the second largest prime factor of n, or f(n) = 0 if m ≤ 1. The prefix sum of f(n) can also be calculated using this method.
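To make this definition concrete, here is a brute-force reference in Python (the function names are mine, and this is only a naive baseline for checking answers, not the sieve itself); it assumes f(n) is the second largest prime factor of n counted with multiplicity, and 0 when n has fewer than two prime factors:

```python
def f(n):
    # Factor n into primes with multiplicity, ascending,
    # and return the second largest one (p_{m-1}); 0 if m <= 1.
    primes = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            primes.append(d)
            n //= d
        d += 1
    if n > 1:
        primes.append(n)
    return primes[-2] if len(primes) >= 2 else 0

def prefix_sum_f(N):
    # Naive reference for sum_{n=1}^{N} f(n); the method in the blog
    # computes the same quantity far faster.
    return sum(f(n) for n in range(1, N + 1))
```

For example, f(12) = 2 (factors 2·2·3), f(9) = 3 (factors 3·3), and f(p) = 0 for any prime p.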

More problems to practice:

  • Sanrd -- the link for the problem above.
  • Simple Function -- if n = p^c for a prime p, then f(n) = p xor c, where xor means bitwise xor.
  • The Sum of Unitary Totient -- you need to optimize very heavily to solve this problem. You can also use some precomputation with a segmented sieve; note that the source limit is 32 KB.
»
4 months ago, # |
score +20

While the approach is very nice, I don't understand the claim that the complexity is O(n^{2/3}). Let me copy a simplified version of the last code snippet here:

for p = 2, 3, 5, ... (all primes <= sqrt(n))
  for x in [n, n / 2, n / 3, ..., n / (sqrt(n) - 1), sqrt(n), sqrt(n) - 1, ..., 1]
    if (x < p^2) break

Let's consider all primes p which are less than n^{1/4}. According to the prime number theorem, there are about n^{1/4} / log n such primes. For each of them the inner loop checks at least the first sqrt(n) values, since all of those are at least sqrt(n) > p^2. So the complexity of the described algorithm is at least Ω(n^{3/4} / log n).
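This argument can be checked numerically. A small Python sketch (names of my choosing) that counts the inner-loop iterations of the pseudocode snippet above:

```python
import math

def count_ops(n):
    # Count the inner-loop iterations of the pseudocode above:
    # for each prime p <= sqrt(n), scan the distinct values of
    # floor(n / x) in decreasing order, stopping once a value
    # drops below p^2.
    r = math.isqrt(n)
    # simple sieve for primes <= sqrt(n)
    sieve = [True] * (r + 1)
    primes = []
    for i in range(2, r + 1):
        if sieve[i]:
            primes.append(i)
            for j in range(i * i, r + 1, i):
                sieve[j] = False
    # the descending list [n, n//2, ..., sqrt(n), sqrt(n)-1, ..., 1]
    values = sorted({n // k for k in range(1, r + 1)} | set(range(1, r + 1)),
                    reverse=True)
    ops = 0
    for p in primes:
        for x in values:
            if x < p * p:
                break
            ops += 1
    return ops
```

For n = 10^6 the count already exceeds n^{2/3} = 10^4: the 11 primes below n^{1/4} alone contribute at least 1000 iterations each, consistent with the Ω(n^{3/4} / log n) lower bound above.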

  • »
    »
    4 months ago, # ^ |
Rev. 2, score 0

Actually, this is the complexity analysis of the 'old' algorithm (discussed about 2-3 years ago at the Chinese NOI Winter Camp). In my opinion, this 'new' approach is only a simplified version of the 'old' one. Its complexity is still O(n^{3/4} / log n).

Because its constant factor is really small, I think the author claims O(n^{2/3}) complexity based only on its running time. :/

    • »
      »
      »
      4 months ago, # ^ |
Rev. 2, score 0

Please read the new version of my blog. I'm not sure you can prove the complexity is O(n^{3/4} / log n) for any "magic >= sqrt(n)". Maybe the framework is the same as the old approach's, but I don't think it's just a "simplification".

      • »
        »
        »
        »
        4 months ago, # ^ |
Rev. 2, score 0

The bigger 'magic' is, the higher the complexity of the first part is. When magic = sqrt(n), the complexity of the first part is already Θ(n^{3/4} / log n), so the overall complexity can't be lower than that.

Though, I think that if the method described in http://codeforces.com/blog/entry/44466 is used for the first part, the complexity might be improved.
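For context, the "first part" under discussion (computing π(v) at every point v = ⌊n/k⌋) has a well-known O(n^{3/4} / log n) baseline, the Lucy_Hedgehog-style DP; the linked blog is about improving on this. A sketch (function name mine, and this is the standard textbook DP, not necessarily the exact code from either blog):

```python
from math import isqrt

def prime_pi_table(n):
    # Lucy_Hedgehog-style DP.  S[v] starts as the count of integers
    # in [2, v]; after sieving by each prime p <= sqrt(n), S[v] counts
    # the integers in [2, v] that are prime or have least prime factor
    # greater than p.  At the end, S[v] = pi(v) for every v = n // k.
    r = isqrt(n)
    values = [n // k for k in range(1, r + 1)]
    values += list(range(values[-1] - 1, 0, -1))  # descending, no duplicates
    S = {v: v - 1 for v in values}
    for p in range(2, r + 1):
        if S[p] > S[p - 1]:                 # p is prime
            sp = S[p - 1]                   # pi(p - 1)
            p2 = p * p
            for v in values:
                if v < p2:
                    break
                # remove numbers in [2, v] whose least prime factor is p
                S[v] -= S[v // p] - sp
    return S
```

For example, `prime_pi_table(10**6)[10**6]` gives π(10^6) = 78498, and every intermediate point ⌊n/k⌋ is available from the same table.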

        • »
          »
          »
          »
          »
          4 months ago, # ^ |
Rev. 2, score 0

I have taken that blog into consideration (when calculating the complexity). But in practice, the more operations there are, the slower the whole program runs.

Edit: I've found the original paper. Comparing the original method with mine, it seems there is a difference: the original method only considers p > sqrt(n), while mine does not. I don't know whether this affects the proof of complexity in the original paper.

»
4 months ago, # |
Rev. 2, score +13

http://codeforces.com/blog/entry/44466 problem F

Choosing the optimal threshold, we obtain the complexity O(n^{2/3}).

»
4 months ago, # |
score 0

Judging by complexity only, is it the same thing? link

»
4 months ago, # |
Rev. 3, score +10

No, I am completely wrong.