### vntshh's blog

By vntshh, history, 2 years ago. I was recently solving some problems on linearity of expectation, and I tried the coupon collector problem: given n coupons, how many coupons do you expect to need to draw with replacement before having drawn each coupon at least once? This is how I tried to solve it:

E[number of coupons needed to draw so that each coupon comes at least once] = Σ_{i = 1}^{n} E[number of coupons drawn till the i-th coupon is drawn for the first time]

This comes from the linearity of expectation (Link), since it holds regardless of whether the events are dependent or independent.

Now, let E[number of coupons drawn till the i-th coupon is drawn for the first time] = x.

This can be solved by applying the following recursive equation,

x = 1 + ((n - 1) / n) * x

On solving, we get x = n.

Thus, E[number of coupons needed to draw so that each coupon comes at least once] = Σ_{i = 1}^{n} x = n · n = n².

I know that I am wrong somewhere, since the answer is n · H_n (where H_n is the n-th harmonic number), but I don't know where exactly it goes wrong. I have discussed this with my friend kr_abhinav, but we couldn't find the mistake in the proof either.
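For comparison, the standard derivation of the n · H_n answer decomposes the total by the waits between successive *new* coupons rather than by first-occurrence times. A minimal sketch (the function name is mine, not from the post):

```python
from fractions import Fraction

def expected_draws(n):
    # Standard coupon-collector decomposition: once i - 1 distinct coupons
    # are held, the wait for the i-th new coupon is geometric with success
    # probability (n - i + 1)/n, so its expectation is n/(n - i + 1).
    # Summing over i gives n * H_n exactly.
    return sum(Fraction(n, n - i + 1) for i in range(1, n + 1))
```

For example, expected_draws(2) gives 3, matching 2 · H_2 = 2 · (1 + 1/2).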

By vntshh, 2 years ago. Hi Codeforces!

Programming Club, IIT Indore and Euristica 2018 are proud to present our flagship event, Divide By Zero! The contest will take place on Saturday, 7th April at 9:35 PM IST. Prizes: Codeforces T-shirts for the top 15 participants overall and the top 15 participants in India.

Thanks to the following people for making the round possible:

There are 8 problems, and 2.5 hours to solve them. The points distribution will be updated later.

Euristica, sponsored by Arcesium, is the inaugural programming festival of IIT Indore. It comprised 10 events this year, spanning domains such as Competitive Programming, Application Development, Cyber Security and Machine Learning. For more information, visit www.euristica.in.

Hope you guys enjoy the contest! See you on the leaderboard!

UPD: The scoring distribution is 500-1000-1500-2000-2250-2500-3000-3500.

UPD 2: Please share your feedback here: https://goo.gl/forms/xbsdMxnkA3XsG4092. We would love to hear from you, since that will help us improve!

We hope you guys enjoyed the contest and found the problems interesting.

UPD 3: You can find the editorial here.

Here is the list of winners who won T-shirts. We will contact you soon. Congrats!

Overall Top 15:

Top 15 in India:

By vntshh, 3 years ago.

## 768A. Oath of the Night's Watch

Set and Editorial by: vntshh

You just have to count the elements that are strictly greater than the minimum of the array and strictly less than its maximum. This can be done in O(n): one traversal finds the minimum and maximum of the array, and a second traversal counts the good numbers.

Complexity: O(n)

Code
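The counting step can be sketched as follows (the function name is illustrative):

```python
def count_good(a):
    # Count elements strictly between the array's minimum and maximum;
    # two passes over the array, O(n) overall.
    lo, hi = min(a), max(a)
    return sum(lo < v < hi for v in a)
```

Note that when all elements are equal, no element is strictly between the minimum and the maximum, so the answer is 0.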

## 768B. Code For 1

Set and Editorial by: killer_bee

It is easy to see that the total number of elements in the final list will be 2^(⌊log₂ n⌋ + 1) − 1, because the list for n is the list for ⌊n/2⌋, followed by n mod 2, followed by the list for ⌊n/2⌋ again. The problem can be solved by locating each element in the list and checking whether it is 1. The i-th element can be located in O(log n) using this divide-and-conquer structure. The answer is the total number of such elements that equal 1.

Complexity : O((r - l + 1) * logn)

Code
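The divide-and-conquer location step can be sketched like this (names are mine; indices are 1-based as in the problem):

```python
def total_len(n):
    # Length of the fully expanded list for n: n expands to
    # [expansion(n // 2), n % 2, expansion(n // 2)] until n is 0 or 1.
    if n < 2:
        return 1
    return 2 * total_len(n // 2) + 1

def element_at(n, i):
    # Value (0 or 1) of the i-th element (1-indexed) of the expansion of n,
    # found in O(log n) by descending into the left half, the middle, or
    # the right half.
    if n < 2:
        return n
    left = total_len(n // 2)
    if i <= left:
        return element_at(n // 2, i)
    if i == left + 1:
        return n % 2
    return element_at(n // 2, i - left - 1)

def count_ones(n, l, r):
    # Answer for a query [l, r]: number of ones among positions l..r.
    return sum(element_at(n, i) for i in range(l, r + 1))
```

For example, 5 expands to [2, 1, 2] and then to [1, 0, 1, 1, 1, 0, 1], which has length 7 = 2^3 − 1.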

## 768C. Jon Snow and his Favourite Number

Set and Editorial by: vntshh

The strength of any ranger at any point in time lies in the range [0, 1023], which allows us to maintain a frequency array of the rangers' strengths. One operation then updates the array as follows. Make a copy of the frequency array and scan strengths in increasing order. Consider a strength y held by freq[y] rangers. If the number of rangers with strength less than y is even, then ceil(freq[y] / 2) of them are updated to strength y XOR x and the rest keep strength y. If that number is odd, then floor(freq[y] / 2) of them are updated to strength y XOR x and the rest keep strength y. This operation is performed k times, so the overall complexity is O(1024 · k).

Complexity: O(k · 2^10)

Code
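The frequency-array update described above can be sketched as follows (the function name and return convention are mine):

```python
def simulate(strengths, k, x):
    B = 1024  # strengths always stay in [0, 1023]
    freq = [0] * B
    for s in strengths:
        freq[s] += 1
    for _ in range(k):
        new = [0] * B
        prefix = 0  # rangers with smaller strength seen so far
        for y in range(B):
            if freq[y]:
                # Rangers sorted by strength: every alternate one, starting
                # from the first, gets XOR-ed with x.
                if prefix % 2 == 0:
                    upd = (freq[y] + 1) // 2   # ceil(freq[y] / 2)
                else:
                    upd = freq[y] // 2          # floor(freq[y] / 2)
                new[y ^ x] += upd
                new[y] += freq[y] - upd
                prefix += freq[y]
        freq = new
    mn = min(y for y in range(B) if freq[y])
    mx = max(y for y in range(B) if freq[y])
    return mn, mx
```

With strengths [9, 7, 11, 15, 5], k = 1, x = 2, the sorted alternate rangers 5, 9, 15 become 7, 11, 13, giving minimum 7 and maximum 13.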

## 768D. Jon and Orbs

Set and Editorial by: arnabsamanta

This problem can be solved using the inclusion-exclusion principle, but precision errors would need to be handled. Therefore, we use the following dynamic programming approach instead.

On the n-th day there are two possibilities:
Case 1: Jon does not find a new orb; this happens with probability x / k.
Case 2: Jon finds a new orb; this happens with probability (k − x + 1) / k (going from x − 1 to x distinct orbs).

This gives the recurrence dp[n][x] = dp[n − 1][x] · (x / k) + dp[n − 1][x − 1] · ((k − x + 1) / k). We need to find the minimum n such that dp[n][k] is at least the required probability, where
n = number of days Jon waited,
x = number of distinct orbs Jon has so far,
dp[n][x] = probability of Jon having x distinct orbs in n days,
k = total number of distinct orbs possible.

Code
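A sketch of the recurrence (the function name is mine; the per-query threshold handling with ε is omitted):

```python
def prob_all_orbs(k, n):
    # dp[x] = probability of owning x distinct orb types after the days
    # processed so far; one day either repeats an owned orb (prob x/k)
    # or yields a new one (prob (k - x)/k).
    dp = [0.0] * (k + 1)
    dp[0] = 1.0
    for _ in range(n):
        ndp = [0.0] * (k + 1)
        for x in range(k + 1):
            if dp[x] == 0.0:
                continue
            ndp[x] += dp[x] * (x / k)              # no new orb today
            if x < k:
                ndp[x + 1] += dp[x] * ((k - x) / k)  # found a new orb
        dp = ndp
    return dp[k]
```

The answer to a query is then the smallest n for which this probability reaches the required threshold; since the probability is non-decreasing in n, that n can be found by scanning upward.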

PS: The ε was added so that any solution considering a probability in the given range passes the system tests.

## 768E. Game of Stones

Set and Editorial by: AakashHanda

This problem can be solved using DP with bitmasks to calculate the grundy value of each pile. Let us have a 2-dimensional DP table, dp[i][j], where the first dimension is the number of stones in the pile and the second is a bitmask. The bitmask has the k-th bit set if we are still allowed to remove k + 1 stones from the pile.

Now, to calculate dp[i][j] we need to iterate over all possible moves allowed and find the mex.

Code

Finally, for the game, we use the grundy value stored in dp[i][2^i − 1] for a pile of size i. We take the XOR of the grundy values of all piles. If it is 0, then Jon wins; otherwise Sam wins.

The complexity of this solution is roughly O(s² · 2^s), where s is the maximum pile size. This will not be accepted. We can use the following optimizations for this problem:

1. Use a top-down approach to calculate the values of dp[i][2^i − 1], hence calculating only those values that are required.
2. Since we can only remove at most i stones from a pile of size i, as stated above, we only need to store values of j in the range [0, 2^i − 1].

So we can rewrite the above code to incorporate these changes. The final solution is as follows.

Code
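The top-down computation can be sketched as follows (names are mine; memoization on (stones, mask) means only reachable states are evaluated):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def grundy(stones, mask):
    # Bit k of mask is set if removing k + 1 stones is still allowed.
    # The grundy value is the mex over all allowed moves.
    reachable = set()
    for k in range(stones):  # k + 1 <= stones stones may be removed
        if mask >> k & 1:
            reachable.add(grundy(stones - (k + 1), mask & ~(1 << k)))
    g = 0
    while g in reachable:
        g += 1
    return g

def winner(piles):
    # XOR of the grundy values of all piles; per the editorial,
    # 0 means Jon wins, otherwise Sam wins.
    x = 0
    for s in piles:
        x ^= grundy(s, (1 << s) - 1)
    return "Jon" if x == 0 else "Sam"
```

For instance, piles of sizes 1 and 2 both have grundy value 1, so their XOR is 0 and Jon wins.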

Bonus: Try to find and prove an O(1) formula for the grundy values.

## 768F. Barrels and boxes

Set and Editorial by: arnabsamanta, apoorv_kulsh

Every arrangement of stacks can be expressed as a linear arrangement in which every contiguous segment of wine barrels is separated by food boxes. For the arrangement to be liked by Jon, each of the f + 1 partitions created by the f food boxes must contain either 0 or more than h wine barrels.

Let u out of the f + 1 partitions have a non-zero number of wine barrels; then the remaining f + 1 − u partitions must have 0 wine barrels.

The total number of arrangements with exactly u stacks of wine barrels is C(f + 1, u) · X, where C(f + 1, u) is the number of ways of choosing u partitions out of the f + 1 partitions, and X is the number of ways to place w wine barrels in these u partitions, which equals the coefficient of x^w in {x^(h + 1) · (1 + x + ...)}^u, i.e. C(w − u·h − 1, u − 1). Finally, we sum this over all u from 1 to f + 1.

So the time complexity becomes O(w) with pre-processing of factorials.

w = 0 was a corner case, for which the answer is 1. We did not anticipate that it would cause so much trouble; not placing it in the pretests was a rookie mistake.

Code
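The summation over u can be sketched as follows (this counts only the liked arrangements; the function name is mine, and turning the count into the required probability is omitted):

```python
from math import comb

def good_arrangements(f, w, h):
    # Number of linear arrangements of w identical wine barrels and f
    # identical food boxes in which every non-empty wine segment has
    # more than h barrels.
    if w == 0:
        return 1  # corner case noted in the editorial
    total = 0
    for u in range(1, f + 2):
        if u * (h + 1) > w:
            break  # cannot give every chosen partition > h barrels
        # C(f+1, u) ways to choose the non-empty partitions, times the
        # number of compositions of w into u parts, each part >= h + 1.
        total += comb(f + 1, u) * comb(w - u * h - 1, u - 1)
    return total
```

As a check: with f = 1, w = 2, h = 1 the only liked arrangements are WWF and FWW (the segment of 2 barrels exceeds h = 1), so the count is 2.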

## 768G. The Winds of Winter

Set and Editorial by: apoorv_kulsh

We are given a tree. We remove one node from this tree to form a forest. The strength of the forest is defined as the size of the largest tree in it. We need to minimize the strength by changing the parent of at most one node to some other node, such that the number of components remains the same.

To find the minimum value of the strength we binary search on it. It is possible to attain strength S if:
1. There is at most one component with size greater than S.
2. There exists a node in the maximum component with subtree size Y such that
M − Y ≤ S (here M is the size of the maximum component and m is the size of the minimum component), and
m + Y ≤ S.
Then we can change the parent of this node to some node in the smallest component.
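The check in condition 2 reduces to asking whether some subtree size falls into an interval, which a sorted container answers in O(log N). A minimal sketch (names are mine):

```python
from bisect import bisect_left

def feasible(S, M, m, subtree_sizes_sorted):
    # Condition 2: need a node of the maximum component with subtree size Y
    # satisfying M - Y <= S and m + Y <= S, i.e. Y in [M - S, S - m].
    lo, hi = M - S, S - m
    if lo > hi:
        return False
    i = bisect_left(subtree_sizes_sorted, lo)
    return i < len(subtree_sizes_sorted) and subtree_sizes_sorted[i] <= hi
```

In the full solution the sorted container is a map of subtree sizes maintained as described below, with an offset applied to the sizes stored for the root-to-X path.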

The problem now is to store the subtree size of every node in the maximum component and binary search over them. We can use an STL map for this.
Let X be the node which is removed and X_size be its subtree size. There are two cases:
1. The maximum-size tree is a child of X.
2. The maximum-size tree is the tree which remains when we remove the subtree of X from the original tree (we refer to this as the outer tree of X).

In the second case, the subtree sizes of the nodes on the path from the root to X change when X is removed; in fact, they decrease by exactly X_size. While binary searching over these nodes there is an offset of X_size, so we store them separately. We therefore maintain 3 maps:
map children: stores the subtree size of all nodes present in the heaviest child of X.
map parent: stores the subtree size of all nodes on the path from the root to X.
map outer: stores the subtree size of all nodes in the outer tree which are not on the path from the root to X.

Go through this blog post before reading further (http://codeforces.com/blog/entry/44351).

Maintaining the Maps
map children and map parent are initialised to empty, while map outer initially contains the subtree size of every node in the tree.

map parent: this can be easily maintained by inserting a node's subtree size into the map when we enter the node in the DFS and removing it when we exit.
map children: can be maintained using the dsu-on-tree trick mentioned in the blog post.
map outer: when we insert a node's subtree size into map children or map parent, we remove it from map outer; similarly, when we erase a node's subtree size from map children or map parent, we insert it back into map outer.

Refer to the HLD-style implementation in the blog post for the easiest way of maintaining map outer.
Refer to the code below for the exact points of insertion into and deletion from the above 3 maps.

Code

Complexity: O(N log² N)