CherryTree's blog

By CherryTree, 10 years ago, translation, In English

Hi everybody!

A new kind of contest starts today at HackerRank. The contest is named "Daily Challenge".

Each day you get a problem to solve, and the difficulty increases as the week progresses. There will be five problems in total, and you have the weekend to solve the final one.

This time I'm the writer of the contest, so I invite everybody to participate. After the contest, there will be an editorial on HackerRank.

The contest has now started, and you have about 20 hours left to solve the first and easiest problem.


»
10 years ago, # | +10

Seems like login through Google is broken

»
10 years ago, # | 0

Is there going to be a "challenge" problem? I mean, there are usually lots of people who solve all the tasks in long contests, especially when full feedback is available. Or maybe some tiebreaking rules?

  • »
    »
    10 years ago, # ^ | Rev. 2, +8

    There should be a system test at the end of each day, so I think getting a perfect score will be quite a bit harder than it was.

    I guess it should be written in the rules somewhere.

»
10 years ago, # | -8

The constraints in task C are crazy :) How do you solve it?

  • »
    »
    10 years ago, # ^ | Rev. 2, 0

    I'm also interested to know the solution. :)
    Although I have this gut feeling that it will be O(N sqrt(P)). ;)

  • »
    »
    10 years ago, # ^ | 0

    A helpful observation to make in this problem is that for every a_i and a_(i+1), at most one of these can be greater than sqrt(P). Using this with some dynamic programming gets you an O(N sqrt(P)) solution.

    • »
      »
      »
      10 years ago, # ^ | 0

      It's a little bit unclear how to use this observation; can you please share the details? I also have an O(N * sqrt(P)) solution, but it's based on another idea.

      • »
        »
        »
        »
        10 years ago, # ^ | +8

        Let W(n, x) be the number of ways to make a sequence of n more elements if the previously used element is x. Also, let Q = floor(sqrt(P)). We will find the final answer, W(N, 1), by using only W(n, x) where x ≤ Q.

        It should be clear that

        W(n, x) = sum over i = 1..floor(P / x) of W(n - 1, i).

        But we have a problem: some of the values of i can be greater than Q. So, to fix this, we can split up the sum into two sums:

        W(n, x) = (sum over i = 1..Q of W(n - 1, i)) + (sum over i = Q + 1..floor(P / x) of W(n - 1, i)).

        Note that this only works for x ≤ Q, but that's all we're worried about anyway. At this point it is nice to notice that W(n, x) ≥ W(n, x + 1), which is intuitive and can also be seen from above. Now, what is W(n, x) - W(n, x + 1)?

        W(n, x) - W(n, x + 1) = sum over i = floor(P / (x + 1)) + 1..floor(P / x) of W(n - 1, i).

        This sum can have a lot of terms, but for i > Q we have W(n - 1, i) = sum over j = 1..floor(P / i) of W(n - 2, j), and floor(P / i) takes at most Q different values. Because of this, we can find W(n, x) - W(n, x + 1) for all the x < Q in amortized O(sqrt(P)) time per layer, so the whole thing is O(N sqrt(P)).

        Does that explain clearly enough?
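
        In case a sketch helps, here is the recurrence above in C++. The statement isn't quoted in this thread, so the input format, the modulus 10^9 + 7 and even the exact counting task (sequences of N positive integers in which every two adjacent elements have a product of at most P, so the answer is W(N, 1) with a virtual previous element of 1) are my assumptions; only the recurrence itself follows the explanation. For i > Q the value W(n - 1, i) is read off the prefix sums of layer n - 2, and the tail sum is extended from x = Q down to 1 in O(1) per step, which is the same differences trick.

        ```cpp
        #include <bits/stdc++.h>
        using namespace std;
        typedef long long ll;

        const ll MOD = 1000000007LL;  // assumed modulus, not stated in the thread

        int main() {
            ll N, P;
            if (scanf("%lld %lld", &N, &P) != 2) return 0;  // assumed input: N and P

            ll Q = (ll)sqrtl((long double)P);
            while (Q * Q > P) Q--;
            while ((Q + 1) * (Q + 1) <= P) Q++;             // Q = floor(sqrt(P))

            // W(n, x) = number of ways to append n more elements when the previous
            // element is x, under the constraint (previous) * (next) <= P.
            // Only W(n, x) with x <= Q is ever stored.
            vector<ll> prev2(Q + 2), prev1(Q + 2), cur(Q + 2), pref2(Q + 2);
            for (ll x = 1; x <= Q; x++) prev2[x] = 1;             // W(0, x) = 1 (empty suffix)
            for (ll x = 1; x <= Q; x++) prev1[x] = (P / x) % MOD; // W(1, x) = floor(P / x)

            if (N == 0) { printf("1\n"); return 0; }
            if (N == 1) { printf("%lld\n", P % MOD); return 0; }  // W(1, 1) = P

            for (ll n = 2; n <= N; n++) {
                // prefix sums of layer n - 2: pref2[j] = W(n-2, 1) + ... + W(n-2, j)
                pref2[0] = 0;
                for (ll j = 1; j <= Q; j++) pref2[j] = (pref2[j - 1] + prev2[j]) % MOD;

                // S1 = W(n-1, 1) + ... + W(n-1, Q)
                ll S1 = 0;
                for (ll i = 1; i <= Q; i++) S1 = (S1 + prev1[i]) % MOD;

                // T = sum of W(n-1, i) for i = Q+1 .. floor(P/x), built from x = Q down to 1.
                // For i in (floor(P/(x+1)), floor(P/x)] we have floor(P/i) = x, and for
                // i > Q the value W(n-1, i) equals pref2[floor(P/i)], so each step is O(1).
                ll T = ((P / Q - Q) % MOD) * pref2[Q] % MOD;      // the x = Q case
                cur[Q] = (S1 + T) % MOD;
                for (ll x = Q - 1; x >= 1; x--) {
                    T = (T + ((P / x - P / (x + 1)) % MOD) * pref2[x]) % MOD;
                    cur[x] = (S1 + T) % MOD;
                }
                swap(prev2, prev1);
                swap(prev1, cur);
            }
            printf("%lld\n", prev1[1]);  // the answer is W(N, 1)
            return 0;
        }
        ```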

        By the way, it now appears to me that my solution is sort of complicated. Do you mind sharing your solution to the problem?

        • »
          »
          »
          »
          »
          10 years ago, # ^ | +5

          Awesome, thanks.

          It turned out that your solution is almost the same as mine. I also used the idea that there are about sqrt(P) different values of the expression [P / x] for integer x from 1 to P.

          Here's my implementation: http://pastie.org/9112567
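
          For anyone who hasn't seen that fact used before, here is a tiny stand-alone illustration (not the pastie above, just an example with an arbitrary P) that enumerates the maximal blocks on which floor(P / x) is constant; there are O(sqrt(P)) of them.

          ```cpp
          #include <cstdio>

          int main() {
              long long P = 100;  // arbitrary example value
              // Enumerate the maximal blocks [x, r] on which floor(P / x) is constant.
              for (long long x = 1, r; x <= P; x = r + 1) {
                  long long q = P / x;
                  r = P / q;      // the largest x' with floor(P / x') == q
                  printf("floor(P/x) = %lld for x in [%lld, %lld]\n", q, x, r);
              }
              return 0;
          }
          ```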

    • »
      »
      »
      10 years ago, # ^ | 0

  • »
    »
    10 years ago, # ^ | +13

    Something like this.

    • »
      »
      »
      10 years ago, # ^ | +3

      You can also take a look at my post, which sketches a simpler solution. Are you the problem setter?

      • »
        »
        »
        »
        10 years ago, # ^ | +13

        Yeah. Actually, I'm the writer of the whole contest ^^"

        • »
          »
          »
          »
          »
          10 years ago, # ^ | 0

          So far it has been really rewarding for me. Thank you, and I hope to see more contests from you.

»
10 years ago, # | +16

When will the system test for the last problem begin?

»
10 years ago, # | 0

Can somebody describe his 140-pt solution to BST Maintenance?

Mine is based on centroid decomposition, but I have seen a lot of quite different ideas in other people's solutions.

  • »
    »
    10 years ago, # ^ | Rev. 2, +8

    My solution is based on heavy-light decomposition. Let's build the final binary search tree (we can do this in O(n log n)) and build a heavy-light decomposition of it. Then we add the vertices one by one. Suppose we are adding vertex u at the current step. How do we calculate the distances from u to all previous vertices? For each old vertex v the distance is distToRoot(u) + distToRoot(v) - 2 * distToRoot(lca(u, v)). So the most difficult part is to calculate the sum of distToRoot(lca(u, v)) over all previous vertices v. It equals the sum, over all vertices on the path from the root to u (except u and the root), of the number of previously added vertices in their subtrees. We can maintain this with heavy-light decomposition and a segment tree for sums with range updates, as sketched below.
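
    Since the statement isn't quoted here, the sketch below assumes the task is: n distinct keys are inserted into a plain BST one by one, and after each insertion we output the sum of distances (in edges) from the new node to all previously inserted nodes; the I/O format is a guess. It follows the plan above: build the final tree offline (the parent of a new key is whichever of its already-present predecessor/successor has the corresponding child slot free), then maintain "+1 on every vertex of the path root..v" for each inserted v, so that a path-sum query at u returns the sum of (distToRoot(lca(u, v)) + 1) over the previous vertices. I used a Fenwick tree with range updates in place of a segment tree, but it plays the same role.

    ```cpp
    #include <bits/stdc++.h>
    using namespace std;
    typedef long long ll;

    // Fenwick tree with range add + range sum (two internal BITs).
    struct BIT {
        int n; vector<ll> b1, b2;
        BIT(int n) : n(n), b1(n + 1, 0), b2(n + 1, 0) {}
        void upd(vector<ll>& b, int i, ll v) { for (; i <= n; i += i & -i) b[i] += v; }
        ll qry(const vector<ll>& b, int i) const { ll s = 0; for (; i > 0; i -= i & -i) s += b[i]; return s; }
        void rangeAdd(int l, int r, ll v) {              // add v to positions l..r (1-based)
            upd(b1, l, v); upd(b1, r + 1, -v);
            upd(b2, l, v * (l - 1)); upd(b2, r + 1, -v * r);
        }
        ll pref(int i) const { return qry(b1, i) * i - qry(b2, i); }
        ll rangeSum(int l, int r) const { return pref(r) - pref(l - 1); }
    };

    int main() {
        int n;
        if (scanf("%d", &n) != 1 || n <= 0) return 0;
        vector<ll> a(n);
        for (auto& x : a) scanf("%lld", &x);             // keys in insertion order, assumed distinct

        // 1. Build the final BST offline: the new key becomes the left child of its
        //    successor if that slot is free, otherwise the right child of its predecessor.
        vector<int> par(n, -1), L(n, -1), R(n, -1);
        map<ll, int> byKey;
        byKey[a[0]] = 0;
        for (int i = 1; i < n; i++) {
            auto it = byKey.lower_bound(a[i]);           // successor of a[i]
            if (it != byKey.end() && L[it->second] == -1) { par[i] = it->second; L[it->second] = i; }
            else { int p = prev(it)->second; par[i] = p; R[p] = i; }
            byKey[a[i]] = i;
        }

        // distToRoot (= depth, the root has depth 0), subtree sizes, heavy children.
        // Insertion order is already a parent-before-child order.
        vector<int> depth(n, 0), sz(n, 1), heavy(n, -1);
        for (int i = 1; i < n; i++) depth[i] = depth[par[i]] + 1;
        for (int i = n - 1; i >= 1; i--) sz[par[i]] += sz[i];
        for (int u = 0; u < n; u++) {
            if (L[u] != -1) heavy[u] = L[u];
            if (R[u] != -1 && (heavy[u] == -1 || sz[R[u]] > sz[heavy[u]])) heavy[u] = R[u];
        }

        // 2. Heavy-light decomposition: assign positions so that every heavy path is contiguous.
        //    An explicit stack is used because the tree can degenerate into a long path.
        vector<int> head(n), tin(n);
        vector<int> st = {0};
        int timer = 0;
        head[0] = 0;
        while (!st.empty()) {
            int u = st.back(); st.pop_back();
            tin[u] = ++timer;
            for (int c : {L[u], R[u]})
                if (c != -1 && c != heavy[u]) { head[c] = c; st.push_back(c); }       // light children
            if (heavy[u] != -1) { head[heavy[u]] = head[u]; st.push_back(heavy[u]); } // popped next
        }

        BIT bit(n);
        auto pathAdd = [&](int v, ll val) {              // add val to every vertex on the path root..v
            for (; v != -1; v = par[head[v]]) bit.rangeAdd(tin[head[v]], tin[v], val);
        };
        auto pathSum = [&](int v) {                      // sum of the values on the path root..v
            ll s = 0;
            for (; v != -1; v = par[head[v]]) s += bit.rangeSum(tin[head[v]], tin[v]);
            return s;
        };

        // 3. Add the vertices one by one and answer after each insertion.
        ll sumDepth = 0;
        for (int i = 0; i < n; i++) {
            ll lcaPlusOne = pathSum(i);                  // = sum over previous v of (distToRoot(lca(i, v)) + 1)
            ll ans = (ll)i * depth[i] + sumDepth - 2 * (lcaPlusOne - i);
            printf("%lld\n", ans);                       // assumed output: one number per insertion
            pathAdd(i, 1);
            sumDepth += depth[i];
        }
        return 0;
    }
    ```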

    • »
      »
      »
      10 years ago, # ^ | +8

      My solution is almost identical to yours, except that I computed the sums of distances in reverse order (from the last added node to the first one). This way we got an O(N * log^2(N)) solution.

      • »
        »
        »
        »
        10 years ago, # ^ | 0

        I have a feeling this solution actually works in O(N * log(N))... At least I couldn't think of a counterexample.

        • »
          »
          »
          »
          »
          10 years ago, # ^ | 0

          I spent some time sharpening my solution from O(n lg^2 n) to O(n lg n) and the runtime difference is only 0.2s. I used the centroid decomposition approach. The only difference is using a linear loop instead of sorting. Weird...

        • »
          »
          »
          »
          »
          10 years ago, # ^ | +5

          I think you're right. I didn't fully analyze the time complexity. I mean, there are BSTs where, for one node, it is possible to spend O(log^2(N)) time with this approach. But summed up over all the nodes, I cannot find any example which takes more than O(N*log(N)).