
Codeforces (c) Copyright 2010-2018 Mike Mirzayanov


Actually, this method can also find the prefix sum of some non-multiplicative functions.

For example, write n = p_1 · p_2 · … · p_m with p_i ≤ p_{i+1}, and let f(n) = p_{m-1} (the second-largest prime factor of n, counted with multiplicity), or 0 if m ≤ 1. The prefix sum of f(n) can also be calculated using this method.

More problems to practice:
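To make the example above concrete, here is a brute-force reference for f(n), the second-largest prime factor counted with multiplicity. This is only a naive cross-check sketch, not the fast method the comment describes; the function names are illustrative, not from the blog's code.

```python
def second_largest_prime_factor(n: int) -> int:
    # Factor n as p_1 * p_2 * ... * p_m with p_i <= p_{i+1},
    # then return p_{m-1}; return 0 when m <= 1.
    factors = []  # prime factors with multiplicity, ascending
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors[-2] if len(factors) >= 2 else 0

def prefix_sum_f(n: int) -> int:
    # Naive O(n sqrt(n)) prefix sum; useful only to validate a fast method.
    return sum(second_largest_prime_factor(k) for k in range(1, n + 1))
```

For instance, 12 = 2 · 2 · 3 gives f(12) = 2, while any prime or 1 gives 0.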

While the approach is very nice, I don't understand the claim that the complexity is O(n^{2/3}). Let me copy a simplified version of the last code snippet here:

Let's consider all primes p which are less than n^{1/4}. By the prime number theorem, there are about n^{1/4} / log n such primes. For each of them the inner loop checks at least the first ≈ √n key values ⌊n/i⌋, since all of those are greater than p². So the complexity of the described algorithm is at least Ω(n^{3/4} / log n).

Actually this is the complexity analysis of the 'old' algorithm discussed about 2-3 years ago in the Chinese NOI Winter Camp. In my opinion, this 'new' approach is only a simplified version of the 'old' approach. Its complexity is still O(n^{3/4} / log n).
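For reference, here is a sketch (in Python, not the blog's actual code) of the kind of sieve this analysis applies to: a Lucy_Hedgehog-style prime count maintained over the O(√n) distinct key values ⌊n/i⌋. The inner loop over all key values ≥ p² is exactly the step the Ω(n^{3/4} / log n) estimate counts.

```python
def prime_pi(n: int) -> int:
    # Count primes <= n by sieving over key values v = n // i.
    # S[v] holds the count of integers in [2, v] not yet removed.
    if n < 2:
        return 0
    vs = []                       # distinct values of n // i, descending
    i = 1
    while i <= n:
        v = n // i
        vs.append(v)
        i = n // v + 1
    S = {v: v - 1 for v in vs}    # initially |{2, ..., v}|
    p = 2
    while p * p <= n:
        if S[p] > S[p - 1]:       # p survived the sieve, so p is prime
            sp = S[p - 1]         # number of primes below p
            for v in vs:          # inner loop: every key value >= p*p
                if v < p * p:
                    break
                S[v] -= S[v // p] - sp  # remove numbers with least factor p
        p += 1
    return S[n]
```

Note that for every prime p < n^{1/4} the `break` never fires before v drops below √n, so each such prime really does touch all ≈ √n large key values, which is where the lower bound in the comment comes from.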

Because its constant factor is really small, I think the author claims its complexity to be O(n^{2/3}) only based on its running time. :/

Please read the new version of my blog. I'm not sure whether you can prove the complexity is O(n^{3/4} / log n) for any "magic" ≥ √n. Maybe the framework is the same as the old approach, but I don't think it's just a "simplification".

The bigger 'magic' is, the higher the complexity of the first part is. When magic = n^{2/3}, the complexity of the first part is Θ(n^{2/3}), so the overall complexity can't be lower than that.

Still, I think that if the method described in http://codeforces.com/blog/entry/44466 were used for the first part, the complexity might be improved.

I have taken that blog into consideration (when calculating the complexity). But in practice, the more operations, the slower the whole program.

Edit: I've found the original paper. I compared the original method with mine, and there seems to be a difference: the original method only considers p > sqrt(n), while mine does not. I don't know whether this affects the proof of complexity in the original paper.

http://codeforces.com/blog/entry/44466 problem F

Judging by complexity only, is it the same thing? link

No, I am completely wrong.