This Sunday, May 3rd, at 19:00 Moscow time, Round 3 of the VK Cup 2015 Championship will take place!

As in Round 2, there will also be an online mirror: a rated round available only to Div. 1 contestants. There will be 6 tasks in random order and a smooth dynamic scoring system.

The round was brought to you by the Codeforces team, the VK team, and yeputons. As usual, we want to thank winger and AlexFetisov for their great help with testing.

The top 50 participants of the online mirror will get a nice VK Cup T-shirt!

Good luck and have fun!

**UPD1** The round is over! Congratulations to all contestants in the top 50; you will get a nice VK Cup 2015 Championship T-shirt soon! An editorial is coming, stay tuned...

**UPD2** Finally, the editorial is ready!

Not rated for div2? (Why negative upvotes?)

You can't compete: the round is "VK Cup 2015 Round 3 (unofficial online mirror, Div. 1 only)".

I think the opposite of "upvotes" is "downvotes" instead of "negative upvotes"...

That helped.

This contest is so `hard` for me. I think I don't have enough knowledge to be a `Div1-contestant`. I hope I will become an `Expert` and improve myself. Hard contest!

I was too late to participate, but was problem C a binary search? Because for all x >= k, the idempotence condition is true?

I think it wasn't. I've tried binary search.

You can't use binary search: for any cycle in the given graph, the answer must be divisible by the length of that cycle.

Counterexample: n = 3, f = (2, 3, 1). Here k = 3 works, but k = 4 does not.

I failed to solve this problem, but I'm pretty sure your condition x >= k does not hold. Consider a cycle 1 -> 2 -> 3 -> 1; for this example, k = 3. Setting k = 4: f^4(1) = 2, f^4(2) = 3, f^4(3) = 1, and f^8(1) = 3, f^8(2) = 1, f^8(3) = 2. What needs to be solved turns out to be a system of congruences, I think.

It's not binary search, because if F_i(x) is not idempotent, this doesn't mean that F_t(x) is not idempotent for all t less than i. For example, in the third sample F_4 is not idempotent while F_3 is.
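This non-monotonicity is easy to check by brute force. Below is a small Python sketch (an illustration, not anyone's actual submission) that tests whether f^k is idempotent, applied to the 3-cycle 1 -> 2 -> 3 -> 1 discussed in this thread:

```python
def is_idempotent_power(f, k):
    """Check whether g = f^k satisfies g(g(x)) = g(x) for all x.

    f is a 0-indexed list with f[x] = image of x.
    """
    def apply_k(x):
        for _ in range(k):
            x = f[x]
        return x

    g = [apply_k(x) for x in range(len(f))]
    return all(g[g[x]] == g[x] for x in range(len(f)))


# The 3-cycle 1 -> 2 -> 3 -> 1, written 0-indexed.
cycle = [1, 2, 0]
print(is_idempotent_power(cycle, 3))  # True:  f^3 is the identity
print(is_idempotent_power(cycle, 4))  # False: f^4 = f, and f(f(x)) != f(x)
```

Since idempotence can hold at k = 3 but fail at k = 4, the predicate is not monotone in k, which is exactly why binary search fails here.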

How to solve `Problem C`? In this problem, I tried the values of `k` from `1` to `100000` (I think it can run in `1s`), but I got `Wrong Answer on test 9`. Can you give me an effective hint?

Hint: k can be larger than 100000, or even larger than 10^9 (I think).

Hint: Maximum possible answer is around 10^14

I wonder how you found the maximum possible answer. May I ask?

In the graph with edges (i -> f(i)), if we have a cycle of length K, then the answer must be divisible by K (so that applying f^K maps every cycle vertex x back to itself, and f^K(f^K(x)) = x). Therefore, an obvious lower bound for the answer comes from a set of cycles of pairwise different lengths; the answer is then their LCM. To maximize the LCM, let's take prime numbers: something like 2 + 3 + 5 + ... + 37, so that the sum is less than 200 and the product is maximized. This is on the order of 10^13.

And the true maximum possible answer is a partition of 200 into groups such that the LCM of the group sizes is maximized. This partition includes not only primes but also some powers of primes. According to the editorial, there is a sequence that describes such maximum LCMs, and it is about 2 * 10^14 for 200, so my lower bound was pretty close.

The maximum test is actually a permutation consisting of 10 cycles of lengths (16, 9, 25, 7, 11, 13, 19, 23, 29, 31).
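The bound discussed above can be checked numerically. The following Python sketch (an illustration, not the reference solution) computes the maximum LCM of cycle lengths with total length at most n, i.e. the maximum order of a permutation of n elements, via a knapsack over prime powers:

```python
def max_lcm(n):
    """Maximum LCM of positive integers summing to at most n
    (Landau's function). An optimal multiset uses at most one
    power p^k of each prime p."""
    # Sieve primes up to n.
    is_prime = [True] * (n + 1)
    primes = []
    for p in range(2, n + 1):
        if is_prime[p]:
            primes.append(p)
            for q in range(p * p, n + 1, p):
                is_prime[q] = False

    dp = [1] * (n + 1)  # dp[s] = best LCM using total cycle length <= s
    for p in primes:
        for s in range(n, 0, -1):   # descending, so each prime is used once
            pk = p
            while pk <= s:
                dp[s] = max(dp[s], dp[s - pk] * pk)
                pk *= p
    return dp[n]

print(max_lcm(100))  # 232792560
print(max_lcm(200))  # about 2 * 10^14, per the editorial's figure
```

This matches the comment above: for small primes it is better to take a power p^k rather than p itself.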

I tried k from 1 to 2500000 and got a runtime error 8 times!

I think the answer for test 9 is bigger than 2500000.

The gist of the problem is to transform it into one about graphs. We construct a graph with edges (i, f(i)); then f(x) is just the neighbor of node x, and f^k(x) is the k-th neighbor of node x. Because every node has an outdegree of exactly one, this graph can be decomposed into cycles and paths that lead into cycles. The condition f^k(f^k(x)) = f^k(x) can be interpreted as: the 2k-th neighbor of node x should equal its k-th neighbor. For nodes that are part of a cycle, the answer is a multiple of the length of the cycle (because k must satisfy the modular equation k ≡ 2k (mod length), which is equivalent to k ≡ 0 (mod length)). So if there were only cycles, the answer would be the LCM of all the cycle lengths. Now what about paths? We need to pick a k large enough to reach a cycle; for any K ≥ k we will just be walking around the cycle, so take the length of the longest path to a cycle and then pick a constant c such that c * lcm_of_cycles ≥ longest_path.

Perfect!!! The best explanation. This is better than the editorial.
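A compact sketch of the algorithm described in this thread (Python; a reconstruction based on the comments, not an actual submission): find every node's distance to its cycle and every cycle's length, then take the smallest positive multiple of the LCM of the cycle lengths that covers the longest tail.

```python
from math import gcd

def min_idempotent_k(f):
    """Smallest k >= 1 such that f^k is idempotent.
    f is 0-indexed: f[x] = image of x."""
    n = len(f)

    # After n steps of f, every node has landed on a cycle.
    cur = list(range(n))
    for _ in range(n):
        cur = [f[x] for x in cur]
    on_cycle = set(cur)

    # LCM of the lengths of all cycles.
    lcm, seen = 1, set()
    for v in on_cycle:
        if v in seen:
            continue
        seen.add(v)
        length, u = 1, f[v]
        while u != v:
            seen.add(u)
            length += 1
            u = f[u]
        lcm = lcm * length // gcd(lcm, length)

    # Longest tail: distance from a node to its cycle.
    def tail(v):
        d = 0
        while v not in on_cycle:
            v, d = f[v], d + 1
        return d
    longest = max(tail(v) for v in range(n))

    # Smallest positive multiple of lcm that is >= the longest tail.
    return lcm * max(1, -(-longest // lcm))

print(min_idempotent_k([1, 2, 0]))  # 3: the cycle 1 -> 2 -> 3 -> 1
print(min_idempotent_k([1, 2, 2]))  # 2: a tail of length 2 into a fixed point
```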

UPD: why does my comment have more "likes" than the comment above? =)

Will there be editorial?

The contests `Round 1` and `Round 2` had an `editorial`, so I can predict: this one will have an editorial too. I also hope that comes true, because I couldn't solve anything out of the 6 problems.

yes... and there was the system test

With 5 minutes to go I asked myself, what was I thinking? Faced with my strength, a graph problem, I instead tried to solve a maths question?? (I suck at maths questions :P)

Oh well, I thought, as long as A passes I'll get a shirt.

So you could imagine my disappointment when it failed TT

So you could imagine my shock at seeing my final rank: 50 !!!!!

Solved 2 problems in 33mins, it's currently 4:30am and I could have slept 2 hours ago. But who cares I get a VK shirt :D :D

Looks like I'm using fast input from now on...

http://codeforces.com/contest/542/submission/10988472

http://codeforces.com/contest/542/submission/10989738

Looks more like a random fluctuation in run time.

I'd also use less copy-paste if I were you :P

It's like a joke. Unfortunately for you.

Had the same issue... Trusted cin/cout too much; should've used the fast input method written in my template :'(

http://codeforces.com/contest/542/submission/10989470

http://codeforces.com/contest/542/submission/10989944

For C, how do we prove that the answer will fit in a long long?

I just wrote up a quick python script to see what max answer could be. Basically, you're looking for a bunch of numbers that sum to <= 200, and have maximal LCM. So just take the first primes with sum <= 200 and find their product.

Edit: This clearly isn't actually correct, see Zlobober's comment below.

Thanks! At first I didn't realize that maximizing the LCM means taking the first primes, but anyway I used this nice proof during the contest: the answers to all previous problems on Codeforces fit in long long, so most probably this one will fit too :D

It's not enough: for small primes it's better to take p^k rather than p.

Yeah, of course you're right.

TBH I was more just giving myself a little peace of mind and then trusting that there wouldn't be a question that'd overflow long longs :P

Maximum orders of permutations of n elements.

Nice one, thanks.

`int dp[N][T]` instead of `int dp[T][N]` `=>` I simply lost the T-shirt.

How to solve problem F? Is it some sort of 0-1 knapsack?

I reduced it to knapsack with all weights as powers of 2 which can be solved greedily.

Yeah, I did the same reduction but couldn't figure out how to make the knapsack run in time. Can you elaborate on "greedily"?

If you want to take something with weight 1, you always want to take a second if it exists. So I took the two most valuable and merged them into an item with weight 2, and continued doing this for all weights until everything had a weight equal to 2^T

Could you explain the proof of your solution?

Think of it like this. At any stage, let the minimum time for a node be x. There are two cases:

1) There is only one node with time x. If this node is combined with another, the net time will be > x + 1 (since x is the minimum time). Hence we just increase the time of this node by 1 and proceed.

2) There are >= 2 nodes with time x. Combine the two most valuable nodes with time x.

See my solution for implementation.

Another approach:

dp[height][free]: the maximum total interest value of the tasks in the tree when we use tasks with T - t[i] ≤ height and have at least free unused leaves. 10989745

I had the same approach. How did you keep track of the number of tasks used at the same height? I had to keep my state as dp[pos][free], where pos was the index of the task I was currently processing. I had the tasks initially sorted by (T - t[i]).

I grouped tasks with respect to T - t[i] and, in each group, kept the tasks sorted by their interest values. I also had to store partial sums for each group in order to update the states.

I used a simple approach. We split tasks by time; every minute forms a level. Start from the lowest level that contains at least one task. Combine the two tasks in this level with the maximum interest and add the result as a task in the next level. If there is only one task, add it to the next level. If the level is empty, proceed to the next level. When you reach the highest level (T), the answer is the maximum interest in this level.

Here is my implementation => 10989241

As far as I understand, you combine current-level tasks into pairs not just once, but as long as there are enough tasks to form at least one pair. Is that right?
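The level-by-level greedy described above can be sketched as follows (Python; a reconstruction from the comments, with the task format assumed to be (time, interest) pairs, so details may differ from the actual problem statement and submissions):

```python
from collections import defaultdict

def best_quest(tasks, T):
    """Greedy from the comment above: a task with time t starts on level t
    (assumed 1 <= t <= T); at each level below T, pair up the most
    interesting tasks and push each merged task (summed interest) one level
    up; a leftover single also moves up. The answer is the best interest
    that reaches level T."""
    levels = defaultdict(list)
    for t, q in tasks:
        if t <= T:
            levels[t].append(q)

    for c in range(1, T):
        cur = sorted(levels[c], reverse=True)
        i = 0
        while i + 1 < len(cur):          # merge pairs: 1st+2nd, 3rd+4th, ...
            levels[c + 1].append(cur[i] + cur[i + 1])
            i += 2
        if i < len(cur):                 # a leftover single just moves up
            levels[c + 1].append(cur[i])

    return max(levels[T]) if levels[T] else 0

# Two quick tasks can share the two depth-1 slots of a T = 2 quest.
print(best_quest([(1, 3), (1, 5), (2, 4)], 2))  # 8
```

Note that all possible pairs are merged at each level, which answers the question above: since interests are nonnegative, merging never hurts the final maximum.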

Hard contest + Small number of participants => low rating change.

I solved 1 problem and have a rank close to someone who didn't solve any.

I suggest counting someone as a contest participant if he/she viewed a problem, not only if he/she tried to solve one.

Better is to always give a simple problem, so that at least everyone has a submission.

Still, one person can choose not to solve the easiest problem when he doesn't know how to solve the other ones, in order to avoid decreasing his rating.

Yeah, he can. But most people won't, as is evident from Div. 1 contests, where many people end up solving just one problem.

Was anyone able to solve problem B in O(n^2) at least?

I think so. Let's sort the ducks by their tail positions. For each duck, let's calculate the answer ignoring ducks with tails behind this duck's tail. Assume that this duck ends at t. I claim that we will either ignore this duck or shoot at a non-empty prefix of the sequence t, t - r, t - 2r, ... We can check all meaningful prefixes by iterating over the previous ducks.

So, are you doing some kind of DP? What about big coordinates? This sequence can be pretty long.

I check two possibilities: 1) there will be no other shots (iterate over all ducks to check this); 2) I will shoot at some previous duck's tail, so I shoot as many ducks as I can while still being able to shoot that duck's tail (iterate over all ducks in reverse order).

Edit: Now I think I should also consider the sequence in the other direction: t, t + r, t + 2r, ...

Is there a nice way to solve problem D?

My solution (which I debugged 2 minutes after the contest) goes as follows. We have to count the number of representations of the form A = (1 + p_1^{a_1}) ... (1 + p_k^{a_k}). We call a prime p < sqrt(A) bad if there is a k such that 1 + p^k divides A. We use dp[divisorsOfA][badPrimes], where dp[d][k] is the number of representations of d as a product of terms 1 + p_i^{a_i} with p_i among the first k bad primes. One can easily write the formula for dp[d][k]. The answer is dp[A][nrBadPrimes].

The number of divisors of A is O(sqrt(A)). But what is the number of bad primes in the worst case? (I think your dp array size is O(A)) :)

Well, in fact the maximum number of divisors is less than 7000, and the number of bad primes for "very composite" numbers was less than 1000.

Thank you very much XD. I had this idea, but I thought the dp size was O(A) :(

Hey, can someone who solved D take a look at this submission — http://codeforces.com/contest/542/submission/10991347

I seem to be doing the same DP as in reference solution, but it's working around 3 seconds on the worst test case, and I don't know how to bring it down. During the contest I had the same solution with a map for storing dp states, but switching to unordered_map didn't help.

Specifically, can you take a look at the go() function. I left the comments on top of it to explain stuff. When the first call is made, I still have around 1.3 seconds to figure out the answer, but DP with 14M ops is working much slower. Let me know if you figure out the problem.

Thanks!

The tons of operations on maps are causing you to lose insane amounts of time in memory allocation. Just get rid of them and use static arrays instead: 10992963

And you can even reorder the dp dimensions for extra cache-friendly brownie points: 10992968

Ah, that's unfortunate ...

Thanks man!

I attempted A first because I'm familiar with data structure problems. So I solved other problems very late. One step of my algorithm in A is to iterate the ads in decreasing order of their ending time. So I wrote:

Looks alright, huh? But the bad thing was that I iterated the ads from n - 1 to 0 instead of from 0 to n - 1. This is why I lost my T-shirt. And luckily now I have the same rating as dreamoon_love_AA.

In problem E, is there a better solution than n times BFS? I mean a solution like this:

1) check bipartiteness (the graph has to be bipartite)

2) in every connected component, find the longest path

3) the answer is the sum of the longest paths over all components
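The outline above can be sketched with plain BFS (Python; an illustration of the commenter's idea only, where the function name and the interpretation of "longest path" as the longest shortest path found by BFS from every node are my assumptions, and whether the sum is the actual answer to problem E is exactly what the comment is asking):

```python
from collections import deque

def mirror_answer(n, edges):
    """BFS sketch: bipartiteness check plus sum of per-component diameters.
    Returns -1 if the graph contains an odd cycle."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)

    # 1) Bipartiteness check via 2-coloring, collecting components.
    color = [-1] * n
    comps = []
    for s in range(n):
        if color[s] != -1:
            continue
        color[s] = 0
        comp, dq = [s], deque([s])
        while dq:
            u = dq.popleft()
            for v in adj[u]:
                if color[v] == -1:
                    color[v] = color[u] ^ 1
                    comp.append(v)
                    dq.append(v)
                elif color[v] == color[u]:
                    return -1          # odd cycle: not bipartite
        comps.append(comp)

    # 2) Eccentricity of a node by BFS; diameter = max over the component.
    def ecc(s):
        dist = {s: 0}
        dq = deque([s])
        while dq:
            u = dq.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    dq.append(v)
        return max(dist.values())

    # 3) Sum the diameters over all components.
    return sum(max(ecc(u) for u in comp) for comp in comps)

print(mirror_answer(3, [(0, 1), (1, 2)]))  # 2: a path on three nodes
```

Running BFS from every node gives the O(n * (n + m)) cost the commenter wants to beat.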

Still waiting for editorial...

This is the editorial, but it is not translated.

The translated version of the editorial has appeared. Sorry for the delay!

Thanks a lot.

My first time becoming an IGM. Yay!

Has anyone received the T-shirt yet?