Vladosiya's blog

By Vladosiya, 14 months ago

1790A - Polycarp and the Day of Pi

Idea: MikeMirzayanov

Tutorial
Solution

1790B - Taisia and Dice

Idea: Gornak40

Tutorial
Solution

1790C - Premutation

Idea: MikeMirzayanov

Tutorial
Solution

1790D - Matryoshkas

Idea: MikeMirzayanov

Tutorial
Solution

1790E - Vlad and a Pair of Numbers

Idea: Vladosiya

Tutorial
Solution

1790F - Timofey and Black-White Tree

Idea: molney

Tutorial
Solution

1790G - Tokens on Graph

Idea: senjougaharin

Tutorial
Solution
  • Vote: I like it
  • +81
  • Vote: I do not like it

»
14 months ago, # |
  Vote: I like it +26 Vote: I do not like it

I think F is interesting; there are many ways to solve it.

»
14 months ago, # |
  Vote: I like it +5 Vote: I do not like it

For F I would recommend tourist's solution.

  • »
    »
    14 months ago, # ^ |
    Rev. 3   Vote: I like it +1 Vote: I do not like it

    Is its TC $$$O(n \log n)$$$?

    Edit: its TC is $$$O(n \log n)$$$.

    • »
      »
      »
      14 months ago, # ^ |
      Rev. 2   Vote: I like it +47 Vote: I do not like it

      It actually is $$$O(n \log n)$$$. Without going into the details of the solution, it's easy to see from the code that its time complexity is $$$O(\text{sum of the answers over all operations})$$$, i.e. the sum of all numbers that we are supposed to output.

      Claim: after $$$k$$$ operations ($$$k > 0$$$), the minimum length of a path between black nodes is at most $$$2 \cdot \frac{n}{k}$$$.

      Proof by contradiction: suppose that the minimum path length is greater than $$$2 \cdot \frac{n}{k}$$$. Then, for every black vertex $$$v$$$ there are at least $$$\frac{n}{k}$$$ nodes $$$u$$$ such that $$$dist(u, v) \le \frac{n}{k}$$$, and by the assumption all these $$$u$$$'s must be white, because $$$\frac{n}{k} < 2 \cdot \frac{n}{k}$$$. Let $$$cnt(v)$$$ denote the exact number of these white nodes for vertex $$$v$$$, and let $$$S$$$ be the sum of $$$cnt(v)$$$ across all black $$$v$$$. I claim that no white vertex is counted twice in $$$S$$$: if it were, it would have a distance of at most $$$\frac{n}{k}$$$ to two different black nodes, but then the distance between those two black nodes would be at most $$$2 \cdot \frac{n}{k}$$$, which contradicts our original assumption. Thus, $$$S$$$ counts each white vertex at most once, and is therefore bounded by $$$n - k$$$. But we also know that each of the $$$k$$$ black vertices contributes at least $$$\frac{n}{k}$$$ to $$$S$$$. Then,

      $$$S \le n - k \implies k \cdot \frac{n}{k} \le n - k \implies n \le n - k \implies k \le 0$$$, which is a contradiction. This finishes the proof.

      Now the TC of the algorithm is the sum of $$$2 \cdot \frac{n}{k}$$$ over all $$$k$$$ from $$$1$$$ to $$$n$$$. But this is just a harmonic series, whose value is $$$O(n \log n)$$$.
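      Written out, that bound is the harmonic sum

      $$$\sum_{k=1}^{n} 2 \cdot \frac{n}{k} = 2n \sum_{k=1}^{n} \frac{1}{k} = O(n \log n).$$$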

      • »
        »
        »
        »
        14 months ago, # ^ |
          Vote: I like it +3 Vote: I do not like it

        Finally understood the proof completely after trying to upsolve this problem for 1 day :)

      • »
        »
        »
        »
        14 months ago, # ^ |
        Rev. 4   Vote: I like it +32 Vote: I do not like it

        Another way to prove your claim is by thinking about the Euler tour of the tree.

        Intuition: it's straightforward to calculate the maximum value of the minimum distance between any two elements after some $$$K$$$ operations in the case of a list instead of a tree. So let's try to represent the tree as a list.

        Your claim would have been trivial if we were given a list instead of a tree in the problem. You can "transform" the tree and represent it using the list of nodes visited during the Euler tour of the tree. The difference in position between any two elements of the list is always greater than or equal to the actual distance in the tree between the nodes they represent. For example, in this case the difference in position between nodes 6 and 1 is 4, whereas the actual distance is 3.

        [Image: Euler tour of the example tree]

        Now we have a list of $$$2N - 1$$$ elements, and even if these elements were all independent of each other (i.e. the same node never appeared in the list twice), the answer would be bounded by $$$\lceil\frac{2N - 1}{K}\rceil$$$.

        Note: the bound works for the tree as well, because whatever the answer for our list is, the actual answer for our tree is less than or equal to it.
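        For concreteness, here is a tiny (hypothetical, not from any submission) routine that builds such an Euler tour list; each vertex is appended on entry and once more after returning from each of its children, which gives exactly $$$2N - 1$$$ entries:

```cpp
#include <bits/stdc++.h>
using namespace std;

vector<vector<int>> g;   // adjacency list of the tree
vector<int> tour;        // Euler tour: 2N - 1 entries for an N-vertex tree

void euler(int v, int p) {
    tour.push_back(v);            // append v on entry
    for (int u : g[v]) {
        if (u == p) continue;
        euler(u, v);
        tour.push_back(v);        // append v again after finishing child u
    }
}
```

        Consecutive entries of tour are adjacent in the tree, which is exactly why positional differences in the list can only overestimate tree distances.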

        • »
          »
          »
          »
          »
          14 months ago, # ^ |
            Vote: I like it +3 Vote: I do not like it

          This is indeed very simple, thanks a lot for the alternative approach!

»
14 months ago, # |
  Vote: I like it 0 Vote: I do not like it

The round was great, thank you so much!

»
14 months ago, # |
  Vote: I like it 0 Vote: I do not like it

When will the system tests take place?

»
14 months ago, # |
Rev. 2   Vote: I like it -8 Vote: I do not like it

Is this contest unrated?

  • »
    »
    14 months ago, # ^ |
      Vote: I like it 0 Vote: I do not like it

    Looks like it.

  • »
    »
    14 months ago, # ^ |
    Rev. 2   Vote: I like it 0 Vote: I do not like it

    No. (Maybe the system test hasn't started because someone is still hacking.)

  • »
    »
    14 months ago, # ^ |
      Vote: I like it +5 Vote: I do not like it

    It isn't; updating ratings for Div. 3/4/Educational rounds takes a while.

»
14 months ago, # |
  Vote: I like it 0 Vote: I do not like it

All problems are interesting! I like them. But a little difficult for Div. 3 :)

  • »
    »
    14 months ago, # ^ |
      Vote: I like it 0 Vote: I do not like it

    I thought that ABCDE were pretty easy, but FG were too hard

  • »
    »
    14 months ago, # ^ |
      Vote: I like it 0 Vote: I do not like it

    Will you please check if my solution for problem E is correct? I did it in $$$O(1)$$$ complexity: 190841912

»
14 months ago, # |
Rev. 3   Vote: I like it 0 Vote: I do not like it

On task B, the values of the elements are bounded between $$$1$$$ and $$$6$$$. So, even if you set everything to $$$s-r$$$ and then subtract manually, the number of operations in the worst case is $$$5n-5$$$, which is still $$$O(n)$$$.
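A minimal sketch of that construction (hypothetical code, not any particular submission; it removes the surplus in bulk rather than one unit at a time, which is also $$$O(n)$$$ per test case and relies on the statement guaranteeing that an answer exists and that the dice may be printed in any order):

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int t;
    scanf("%d", &t);
    while (t--) {
        long long n, s, r;
        scanf("%lld %lld %lld", &n, &s, &r);
        long long mx = s - r;                  // value of the removed (maximum) die
        vector<long long> d(n - 1, mx);        // start every remaining die at the maximum
        long long surplus = mx * (n - 1) - r;  // total amount we still need to subtract
        for (int i = 0; i < n - 1 && surplus > 0; i++) {
            long long take = min(surplus, mx - 1);  // never push a die below 1
            d[i] -= take;
            surplus -= take;
        }
        printf("%lld", mx);
        for (long long x : d) printf(" %lld", x);
        printf("\n");
    }
    return 0;
}
```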

  • »
    »
    14 months ago, # ^ |
      Vote: I like it 0 Vote: I do not like it

    The vector res is sorted as well, so it should be $$$O(n \log n)$$$.

    • »
      »
      »
      14 months ago, # ^ |
        Vote: I like it -7 Vote: I do not like it

      Nope, my solution takes an entirely different approach. I just start with $$$s-r$$$ at every index and loop through $$$[1,n)$$$ (0-indexed), subtracting $$$1$$$ every time until the total sum becomes $$$s$$$. The worst-case time complexity is $$$O(n)$$$ due to the bound on the elements' values.

      • »
        »
        »
        »
        14 months ago, # ^ |
          Vote: I like it 0 Vote: I do not like it

        Okay, I thought you were talking about the solution given in the tutorial; its complexity is given as $$$O(N^2)$$$. I went with a similar approach: print $$$s-r$$$ until the remaining sum becomes less than $$$s-r$$$, then divide the sum left (if any) over the remaining number of elements to be printed. So the TC was $$$O(N)$$$ in my approach.

    • »
      »
      »
      14 months ago, # ^ |
        Vote: I like it +1 Vote: I do not like it

      My solution for task B: 190928632, complexity $$$O(n)$$$.

»
14 months ago, # |
  Vote: I like it +2 Vote: I do not like it

D is harder than E. E was quite simple.

»
14 months ago, # |
Rev. 2   Vote: I like it +29 Vote: I do not like it

F is a standard problem for centroid decomposition. Build the centroid tree and process two types of requests: add a new black vertex, and find the closest black vertex.

When we insert a new black vertex, we just push to every ancestor of it in the centroid tree the distance from the vertex to that centroid, keeping the minimum in mn[cent].

When we query the distance, we just iterate over all ancestors and relax the answer with $$$\min(ans, d(v, cent) + mn[cent])$$$.

Working code: https://codeforces.com/contest/1790/submission/190980021
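A minimal sketch of that scheme (my own hypothetical code, assuming a 1-indexed connected tree; not the linked submission). anc[v] stores every centroid-tree ancestor of v together with the tree distance to it, so both operations touch only $$$O(\log n)$$$ entries:

```cpp
#include <bits/stdc++.h>
using namespace std;

const int INF = 1e9;
vector<vector<int>> g;               // tree adjacency list, vertices 1..n
vector<int> sz, mn;                  // mn[c] = min distance from centroid c to any black vertex
vector<bool> removed;                // removed from the current centroid-decomposition layer
vector<vector<pair<int,int>>> anc;   // anc[v] = (centroid-tree ancestor, tree distance to it)

int calcSize(int v, int p) {
    sz[v] = 1;
    for (int u : g[v]) if (u != p && !removed[u]) sz[v] += calcSize(u, v);
    return sz[v];
}

int findCentroid(int v, int p, int treeSize) {
    for (int u : g[v])
        if (u != p && !removed[u] && 2 * sz[u] > treeSize)
            return findCentroid(u, v, treeSize);
    return v;
}

void collect(int v, int p, int d, int c) {   // record distance from centroid c to its whole component
    anc[v].push_back({c, d});
    for (int u : g[v]) if (u != p && !removed[u]) collect(u, v, d + 1, c);
}

void build(int v) {                          // decompose the component containing v
    int c = findCentroid(v, -1, calcSize(v, -1));
    collect(c, -1, 0, c);
    removed[c] = true;
    for (int u : g[c]) if (!removed[u]) build(u);
}

void paint(int v) {                          // vertex v becomes black
    for (auto [c, d] : anc[v]) mn[c] = min(mn[c], d);
}

int closest(int v) {                         // distance from v to the nearest black vertex so far
    int res = INF;
    for (auto [c, d] : anc[v]) res = min(res, d + mn[c]);
    return res;
}
```

Resize g, sz, mn (filled with INF), removed and anc to $$$n + 1$$$ and call build(1) once. Then, for this problem: paint the first given vertex, and for every next vertex $$$v$$$ do $$$ans = \min(ans, closest(v))$$$, paint(v), and print $$$ans$$$. The build is $$$O(n \log n)$$$ and each operation is $$$O(\log n)$$$.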

  • »
    »
    14 months ago, # ^ |
      Vote: I like it 0 Vote: I do not like it

    What does "relax ans" mean?

    • »
      »
      »
      14 months ago, # ^ |
        Vote: I like it +1 Vote: I do not like it

      If $$$ans_i$$$ is the answer on the $$$i$$$-th iteration, we take the minimum of the previous $$$ans_{i - 1}$$$ and the currently calculated minimum distance.
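      In formula form, using the notation of the parent comment: $$$ans_i = \min\left(ans_{i-1},\ \min_{cent}\big(d(v, cent) + mn[cent]\big)\right)$$$, where the inner minimum runs over the centroid-tree ancestors $$$cent$$$ of the newly added vertex $$$v$$$.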

  • »
    »
    14 months ago, # ^ |
      Vote: I like it 0 Vote: I do not like it

    https://codeforces.com/contest/1790/submission/190946802 Can you tell me the time complexity of this solution to F, and how to optimize it? Btw, let me describe my solution:

    Initially, I found the minimum distance of every node from the black node and stored it in an array.

    Then, during each query, I find the minimum distance between two black nodes, which takes $$$O(1)$$$ time, and then update the minimum distances of the nodes accordingly using BFS, continuing only while the new distance is smaller than the stored minimum.

»
14 months ago, # |
  Vote: I like it 0 Vote: I do not like it

Can someone help me understand why my solution to problem F gets TLE?

My idea is to go through each black node and do a DFS up to answer steps (answer is the minimum distance so far) to see if I can find another black node that is closer than answer. If I find one, I update answer so that next time the DFS can go to a smaller depth. However, this gets TLE at test case 18. I thought the time complexity was $$$O(n^{1.5})$$$; please point out my mistake.

My submission 190916688

Thank you for your help.

  • »
    »
    3 months ago, # ^ |
      Vote: I like it 0 Vote: I do not like it

    Here is a counterexample that leads to $$$O(n^2)$$$ for this method: a star graph with vertex 1 in the middle and all other nodes connected to 1. Color every vertex other than 1 black; then you will see that every time we need to traverse all the edges connected to 1, which leads to $$$O(n^2)$$$.

    So we need to maintain dist[curr] as the shortest distance to a black node and stop if continuing will not produce a better answer. In the case above, we should stop at vertex 1 every time, because going on to 1's neighbours will not make things better.

»
14 months ago, # |
  Vote: I like it 0 Vote: I do not like it

Can someone explain tourist's solution for problem F in detail?

»
14 months ago, # |
  Vote: I like it 0 Vote: I do not like it

To anyone who has understood tourist's solution, please correct me if I am wrong

Whenever we turn a node N1 black, we consider its ancestors in hierarchical order and recompute each one's best distance as the minimum of its distance from N1 and its previous best distance. We stop when we reach the root of the tree, or when we encounter an ancestor whose best distance is already better than its distance from N1 (because none of the remaining ancestors can possibly benefit from N1), or when we reach an ancestor whose distance from N1 is greater than the current answer (since it can never contribute to the actual answer, as the answer only decreases or stays the same).

IDK about its TC though.
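For reference, here is a rough hypothetical sketch of that walk as I read it (not the actual submission): best[u] stores the best known distance from u to an already-black vertex, and the global answer is relaxed along the same walk.

```cpp
// Assumes the tree is rooted arbitrarily, parent[root] == 0 (sentinel),
// best[] is initialized to a value larger than any distance (e.g. n + 1),
// and ans starts at n. The extra early break on best[u] <= d described
// above can be added as a further optimization.
void makeBlack(int v, const vector<int>& parent, vector<int>& best, int& ans) {
    int u = v, d = 0;
    while (u != 0 && d <= ans) {        // going further than ans cannot improve anything
        ans = min(ans, d + best[u]);    // pair v with the closest earlier black vertex known at u
        best[u] = min(best[u], d);      // v itself is now a black vertex at distance d from u
        u = parent[u];
        d++;
    }
}
```

Usage for this problem: call makeBlack on the first (initially black) vertex, then for every subsequent vertex call makeBlack and print ans. The walk length is bounded by the current answer, which is exactly what the complexity proof higher up in the comments relies on.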

»
14 months ago, # |
  Vote: I like it 0 Vote: I do not like it

There's a different solution for problem C. link


»
14 months ago, # |
  Vote: I like it 0 Vote: I do not like it

Please start the system testing. Why does it take so long, when other platforms give the new ratings 15 minutes after the end of the contest?

»
14 months ago, # |
Rev. 3   Vote: I like it +1 Vote: I do not like it

E can be done with binary search too

  • »
    »
    14 months ago, # ^ |
    Rev. 3   Vote: I like it 0 Vote: I do not like it

    Here is my implementation: 191020389

  • »
    »
    6 months ago, # ^ |
      Vote: I like it 0 Vote: I do not like it

    For binary search, doesn't f(x) have to be monotonic? Sorry, I'm not understanding how BS won't get stuck somewhere. In this case, $$$\frac{X}{2}$$$ is a solution for $$$a$$$, so BS will always try it, but how do you prove that without knowing the solution beforehand?

    • »
      »
      »
      6 months ago, # ^ |
        Vote: I like it 0 Vote: I do not like it

      We have a fixed number of bits, i.e. those of $$$x$$$. Now we will always assign $$$a \le b$$$ such that $$$a + b = 2x$$$. If we increase $$$a$$$, then $$$b$$$ will decrease and the XOR value of the two will also decrease, which means the XOR value is monotonically decreasing with respect to $$$a$$$; hence we apply binary search with $$$l = 1$$$ and $$$r = x$$$ for the value of $$$a$$$.

      • »
        »
        »
        »
        6 months ago, # ^ |
          Vote: I like it 0 Vote: I do not like it

        If $$$x=6$$$, $$$a=2$$$ and $$$b=10$$$, then $$$a \oplus b = 8$$$. Now if we increase $$$a$$$ by 1, we get $$$a=3$$$, $$$b=9$$$ and $$$a \oplus b = 10$$$; so on increasing $$$a$$$, $$$a \oplus b$$$ increases. Now we increase $$$a$$$ by 2: $$$5 \oplus 7 = 2$$$. Therefore $$$a \oplus b$$$ is not monotonic, so it's not a candidate for binary search.

        • »
          »
          »
          »
          »
          6 months ago, # ^ |
          Rev. 2   Vote: I like it 0 Vote: I do not like it

          .

          • »
            »
            »
            »
            »
            »
            6 months ago, # ^ |
              Vote: I like it 0 Vote: I do not like it

            Let me think of a better explanation, give me 5 min.

        • »
          »
          »
          »
          »
          6 months ago, # ^ |
            Vote: I like it 0 Vote: I do not like it

          I am applying binary search on a

        • »
          »
          »
          »
          »
          6 months ago, # ^ |
            Vote: I like it 0 Vote: I do not like it

          An $$$(a, b)$$$ pair exists when $$$x = y$$$, where for $$$a = mid$$$ we set $$$y = mid \oplus (2x - mid)$$$. If $$$x > y$$$, then we have to increase the XOR, which can only be done by decreasing $$$a$$$: as I told you earlier, we are looking for a solution with $$$a \le b$$$, and to increase the XOR of $$$a$$$ and $$$b$$$, $$$a$$$ should be decreased and $$$b$$$ increased; and vice versa for $$$y > x$$$.

»
14 months ago, # |
  Vote: I like it +3 Vote: I do not like it

Can someone explain in detail where the $$$\sqrt{n}$$$ in the time complexity of F comes from?

»
14 months ago, # |
Rev. 2   Vote: I like it +2 Vote: I do not like it

I was asked to give more details on my solution of 1790E - Vlad and a Pair of Numbers. I will also share it here.

Solution for problem E
  • »
    »
    14 months ago, # ^ |
      Vote: I like it 0 Vote: I do not like it

    It can easily be done with binary search in the same complexity.

  • »
    »
    6 months ago, # ^ |
      Vote: I like it 0 Vote: I do not like it

    the most understandable tutorial, thank you

»
14 months ago, # |
Rev. 2   Vote: I like it +4 Vote: I do not like it

Is it possible to do F using a BFS? If I understand the tutorial correctly, we skip vertices v that have d[v] > ans. Similarly, we could do a BFS where the number of levels you visit does not exceed ans. Will this be as efficient as the DFS approach? Why or why not?

»
14 months ago, # |
  Vote: I like it +4 Vote: I do not like it

Can someone please explain the solution of F more clearly? I didn't understand why the editorial solution works, nor its time complexity.

»
14 months ago, # |
  Vote: I like it +4 Vote: I do not like it

I am really disappointed because it's my first contest and I solved only two problems and got only one accepted. But I think there is something wrong with problem B (Taisia and Dice), because the statement says (in any order / print any) and I got it right. Like here: 3 12 6, my answer was 6 3 3 (3 + 3 = 6 and 6 + 6 = 12, and that is 3 dice). I am really disappointed, but hey, it is my first one. Thank you :) <3

»
14 months ago, # |
  Vote: I like it +3 Vote: I do not like it

So, I was thinking about when to show my way of solving the problems, and I think it's best if I do it only when I can solve at least three problems. So, as always, don't focus on my code, because it has less value than what I was thinking during the contest.

A
B
C
D

Hope that my way of thinking can help you in some way in your next contest.

»
14 months ago, # |
  Vote: I like it +1 Vote: I do not like it

Alternate approach for D, divide and conquer: https://codeforces.com/contest/1790/submission/190924408

Like merge sort, but the merge function instead calculates the number of matryoshka sets that can be combined across the left and right intervals.

»
14 months ago, # |
  Vote: I like it 0 Vote: I do not like it

Could someone tell me why using unordered_map in D gets TLE while map does not? unordered_map code: 191060893, map code: 191061543

»
14 months ago, # |
  Vote: I like it 0 Vote: I do not like it

I really don't understand the tutorial of problem F! Can anyone explain it in more detail and prove its time complexity of $$$O(n\sqrt{n})$$$?

»
14 months ago, # |
Rev. 3   Vote: I like it 0 Vote: I do not like it

I wonder how the bound of $$$O(n \sqrt{n})$$$ with DFS/BFS is proven. Suppose that we only update nodes with distance smaller than $$$2$$$; even in such a case we can visit all $$$n$$$ nodes on certain graphs. Even if we know that $$$dist \le \sqrt{n}$$$, how can we bound the number of visited nodes?

  • »
    »
    13 months ago, # ^ |
      Vote: I like it 0 Vote: I do not like it

    Exactly, I have the same question. Even if the max distance is $$$\sqrt{n}$$$, for every vertex we may have to visit $$$\sqrt{n}$$$ times the number of edges connected to it, which sums overall to something I don't know (so how can you say it is of order $$$n\sqrt{n}$$$?). Plus, I think the DFS works correctly because it visits vertices in an arbitrary order and keeps replacing ans, which in turn reduces the total number of actual iterations per vertex. I am saying this because I remember someone complaining that his code was giving TLE at the 7th test case with BFS but got AC with DFS.

»
13 months ago, # |
  Vote: I like it 0 Vote: I do not like it

With regard to my comment above, I would like to request a thorough check on the given solution of problem 1790F. I am a bit confident that the solution isn't correct (i.e. it may not run in $$$O(n \log n)$$$ or $$$O(n \sqrt{n})$$$ for all trees). Thanks a lot!

»
8 months ago, # |
  Vote: I like it 0 Vote: I do not like it

Can someone explain the solution of E in more detail?

»
8 months ago, # |
  Vote: I like it 0 Vote: I do not like it

Why am I getting MLE for problem D on the first test case? CODE

»
8 months ago, # |
  Vote: I like it 0 Vote: I do not like it

Hi, could someone help me with my solution to 1790B? Test case 1 says that my solution doesn't pass because it doesn't add up to 15, but my solution outputs 6 3 3 3, which does add up to 15?

»
6 months ago, # |
Rev. 4   Vote: I like it 0 Vote: I do not like it
DFS-ish solution for problem D. Matryoshkas

First, we build a frequency map, as any approach needs to take care of the counts of each toy. Here each key represents a toy size and the associated value represents the number of toys having that size.

We define a directed graph such that:

  • The available sizes of the toys are the nodes.
  • There exists at most one outgoing directed edge from any node.
  • Every outgoing edge goes from a node representing a toy of size $$$S$$$ to the one representing size $$$S+1$$$.
  • Then, if we run a DFS, we only have one neighbour to try out per node (simply look up in the frequency map whether a toy of size $$$source+1$$$ is available, i.e. whether the frequency of $$$source+1$$$ is nonzero).

We sort the available sizes (the input array) and start the DFS with the smallest size as the source. Obviously the smallest toys are the ones that don't have any smaller neighbour to "try out", so it makes sense to start at the smallest toys. Since this is a directed graph and there is only one neighbour to "try out" per node, we don't need the usual visited array of DFS implementations. Instead, every time we visit a node we put one toy of that size in the current set, and thus update the frequency map by subtracting 1.

Ultimately, the number of connected components found by the DFS is the minimum number of sets that can be formed with the toys.

Sample code: 223730733
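A compact sketch of this idea (hypothetical code; the linked submission above is the full version). Each toy is consumed exactly once, so with the map lookups it runs in $$$O(n \log n)$$$ per test case:

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int t;
    scanf("%d", &t);
    while (t--) {
        int n;
        scanf("%d", &n);
        vector<int> a(n);
        map<int, int> cnt;                    // toy size -> number of toys of that size
        for (int &x : a) { scanf("%d", &x); cnt[x]++; }
        sort(a.begin(), a.end());
        int sets = 0;
        for (int s : a) {
            if (cnt[s] == 0) continue;        // this toy was already used by an earlier set
            sets++;                           // start a new set at the smallest unused size
            for (int cur = s; cnt.count(cur) && cnt[cur] > 0; cur++)
                cnt[cur]--;                   // follow the only outgoing edge: size cur -> cur + 1
        }
        printf("%d\n", sets);
    }
    return 0;
}
```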

»
6 months ago, # |
Rev. 2   Vote: I like it 0 Vote: I do not like it

I don't understand in G why this test case expects NO:

test

There is a path from 3 to 1 with all vertices having a bonus, so we can reach 1 using the rules. Can someone explain? Also, sorry for necroposting.

»
4 months ago, # |
  Vote: I like it 0 Vote: I do not like it

Problem C is the worst problem I've seen so far. Ruined my day, thanks.

»
3 months ago, # |
  Vote: I like it 0 Vote: I do not like it

Can we solve F with square root decomposition?

»
2 months ago, # |
  Vote: I like it 0 Vote: I do not like it

241357745

Why am I getting TLE? Can anyone please help me? Thanks in advance.