rng_58's blog

By rng_58, 4 years ago, In English

We will hold Tokio Marine & Nichido Fire Insurance Programming Contest 2020.

The point values will be 100-200-500-700-800-1000.

We are looking forward to your participation!

»
4 years ago, # |
  Vote: I like it +32 Vote: I do not like it

How to solve D?

  • »
    »
    4 years ago, # ^ |
      Vote: I like it +39 Vote: I do not like it

    Meet in the middle. Naive dp for the top 9 levels of the binary tree. If a query vertex has depth $$$\le 9$$$, we're done. Otherwise, brute force all $$$2^{9}$$$ subsets of the last $$$9$$$ ancestors and read the dp value from the suitable ancestor to get the answer for each query in $$$O(2^{9})$$$.
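    A minimal sketch of this approach, assuming D is the complete-binary-tree knapsack problem: node i holds an item (val_i, wt_i), the parent of i is i/2, and a query (v, L) asks for the maximum total value on the path from v to the root with total weight at most L. The input format and the bound on L are assumptions.

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int N;
    scanf("%d", &N);
    vector<long long> val(N + 1), wt(N + 1);
    for (int i = 1; i <= N; i++) scanf("%lld %lld", &val[i], &wt[i]);

    const int MAXW = 100000;  // assumed bound on the query limits L
    const int TOP = 511;      // vertices of depth <= 9 are exactly 1..511
    int top = min(N, TOP);

    // dp[x][w] = best value on the path x -> root with total weight <= w
    // (~0.4 GB under these assumed bounds; shrink MAXW or the value type if memory is tighter)
    vector<vector<long long>> dp(top + 1, vector<long long>(MAXW + 1, 0));
    for (int x = 1; x <= top; x++)
        for (int w = 0; w <= MAXW; w++) {
            dp[x][w] = (x > 1 ? dp[x / 2][w] : 0);
            if (w >= wt[x])
                dp[x][w] = max(dp[x][w], (x > 1 ? dp[x / 2][w - wt[x]] : 0) + val[x]);
        }

    int Q;
    scanf("%d", &Q);
    while (Q--) {
        int x;
        long long L;
        scanf("%d %lld", &x, &L);
        vector<int> deep;                               // ancestors of depth > 9 (at most 9 of them)
        while (x > TOP) { deep.push_back(x); x /= 2; }  // x ends up at the deepest ancestor of depth <= 9
        long long ans = 0;
        int m = deep.size();
        for (int mask = 0; mask < (1 << m); mask++) {   // brute force the 2^m deep subsets
            long long v = 0, w = 0;
            for (int i = 0; i < m; i++)
                if (mask >> i & 1) { v += val[deep[i]]; w += wt[deep[i]]; }
            if (w <= L) ans = max(ans, v + dp[x][L - w]);
        }
        printf("%lld\n", ans);
    }
}
```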

  • »
    »
    4 years ago, # ^ |
      Vote: I like it +8 Vote: I do not like it

    tl;dr: meet in the middle.
    Store all the queries at their respective nodes, to answer them offline.
    Then run a single DFS on the tree.

    • While processing a node with height <= 9, maintain dp[height][w] — the maximum sum of val on the current root path with weight = w.

    • And while processing a node with height > 9, store all its ancestors (including itself) with height > 9, then iterate over all subsets of those stored ancestors and, combining with the dp table, take the maximum val with total weight at most L.

    My submission
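    For reference, a rough sketch of this offline DFS, under the same problem assumptions as the sketch in the comment above (the input format, the limits, and the split depth of 9 are assumptions):

```cpp
#include <bits/stdc++.h>
using namespace std;

const int MAXW = 100000;  // assumed bound on the query limits L
const int SPLIT = 9;      // depth at which we switch from DP rows to brute force

int N, Q;
vector<long long> val, wt;
vector<vector<pair<int, int>>> qs;             // qs[v] = queries (L, id) stored at vertex v
vector<long long> ans;
vector<vector<long long>> dpRows;              // DP rows of the ancestors with depth <= 9
vector<pair<long long, long long>> deepStack;  // (val, wt) of the ancestors with depth > 9

void dfs(int x, int depth) {
    if (depth <= SPLIT) {
        // extend the parent's knapsack row by the item at x
        vector<long long> row(MAXW + 1, 0);
        const vector<long long>* par = dpRows.empty() ? nullptr : &dpRows.back();
        for (int w = 0; w <= MAXW; w++) {
            row[w] = par ? (*par)[w] : 0;
            if (w >= wt[x]) row[w] = max(row[w], (par ? (*par)[w - wt[x]] : 0) + val[x]);
        }
        dpRows.push_back(move(row));
    } else {
        deepStack.push_back({val[x], wt[x]});
    }
    for (auto [L, id] : qs[x]) {
        long long best = 0;
        int m = deepStack.size();
        for (int mask = 0; mask < (1 << m); mask++) {  // brute force the deep ancestors
            long long v = 0, w = 0;
            for (int i = 0; i < m; i++)
                if (mask >> i & 1) { v += deepStack[i].first; w += deepStack[i].second; }
            if (w <= L) best = max(best, v + dpRows.back()[L - w]);
        }
        ans[id] = best;
    }
    if (2 * x <= N) dfs(2 * x, depth + 1);
    if (2 * x + 1 <= N) dfs(2 * x + 1, depth + 1);
    if (depth <= SPLIT) dpRows.pop_back();
    else deepStack.pop_back();
}

int main() {
    scanf("%d", &N);
    val.resize(N + 1); wt.resize(N + 1); qs.assign(N + 1, {});
    for (int i = 1; i <= N; i++) scanf("%lld %lld", &val[i], &wt[i]);
    scanf("%d", &Q);
    ans.assign(Q, 0);
    for (int i = 0; i < Q; i++) {
        int v, L;
        scanf("%d %d", &v, &L);
        qs[v].push_back({L, i});
    }
    dfs(1, 1);
    for (int i = 0; i < Q; i++) printf("%lld\n", ans[i]);
}
```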

»
4 years ago, # |
  Vote: I like it 0 Vote: I do not like it

Can someone give me hints for solving C? I came up with the brute force, but that's obviously TLE.

  • »
    »
    4 years ago, # ^ |
    Rev. 3   Vote: I like it +14 Vote: I do not like it

    After around log N operations, the array becomes [N, N, N, ..., N].

    Now it's easy: you can use a Fenwick tree or prefix sums to compute the array after each operation until then.
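    A minimal sketch of this, assuming C is the lamps problem where lamp i lights positions [i - A_i, i + A_i] and one operation replaces each A_i by the number of lamps lighting position i (the input format and the cap of 41 rounds are assumptions):

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    long long N, K;
    scanf("%lld %lld", &N, &K);
    vector<long long> A(N);
    for (auto &x : A) scanf("%lld", &x);

    K = min(K, 41LL);  // after roughly log2(N) operations every value is already N
    while (K--) {
        vector<long long> diff(N + 1, 0);  // difference array (prefix-sum trick)
        for (long long i = 0; i < N; i++) {
            long long l = max(0LL, i - A[i]);
            long long r = min(N - 1, i + A[i]);
            diff[l] += 1;                  // lamp i covers positions [l, r]
            diff[r + 1] -= 1;
        }
        bool saturated = true;
        long long cur = 0;
        for (long long i = 0; i < N; i++) {
            cur += diff[i];
            A[i] = cur;
            if (A[i] != N) saturated = false;
        }
        if (saturated) break;              // the array is [N, N, ..., N]; it will not change again
    }
    for (long long i = 0; i < N; i++)
        printf("%lld%c", A[i], i + 1 == N ? '\n' : ' ');
}
```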

    • »
      »
      »
      4 years ago, # ^ |
        Vote: I like it +9 Vote: I do not like it

      How do you prove that it won't take more than log N operations?

      • »
        »
        »
        »
        4 years ago, # ^ |
        Rev. 2   Vote: I like it +1 Vote: I do not like it

        I was also stuck on that for 2 hours, trying to prove that it doesn't take more operations, but the editorialist has described it nicely.

      • »
        »
        »
        »
        4 years ago, # ^ |
          Vote: I like it 0 Vote: I do not like it

        Intuitively that's how it works; you can run the algorithm for the maximum N with all values equal to 0 to confirm it experimentally.

»
4 years ago, # |
  Vote: I like it +10 Vote: I do not like it

In the editorial of problem E, the second solution mentions that the number of ways to choose at least one number with a 0 in that bit, or at least one number with a 1 in that bit, can be computed in $$$O(N + 2^{K}K)$$$ using inclusion-exclusion. How's that?

»
4 years ago, # |
  Vote: I like it 0 Vote: I do not like it

D knapsack tree, I do not get it.

"Thus, we can try all subsets of the items in the vertices at depth K or deeper in O(2^K) time."

How?

»
4 years ago, # |
  Vote: I like it +5 Vote: I do not like it

Are the test cases available?

»
4 years ago, # |
Rev. 3   Vote: I like it +19 Vote: I do not like it

In D, we can solve the problem even when $$$L$$$ is large. For each query, take the list of all $$$O(2k)$$$ ancestors, split it into two halves, and for each half build the list of all subsets sorted by weight in $$$O(2^k)$$$ (you can do it by merging, in linear time, the sets $$$A$$$ and $$$A+x$$$ for each element $$$x$$$, for a total of $$$2^0+2^1+\ldots+2^k = O(2^k)$$$ work). Then you can get the answer for each query using two pointers.
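A minimal sketch of the two-pointer step (the function name and the demo values are illustrative): here `a` and `b` are the subset (weight, value) lists of the two halves, each sorted by weight and each containing the empty subset (0, 0); building them without sorting is shown in the reply below.

```cpp
#include <bits/stdc++.h>
using namespace std;

using PW = pair<long long, long long>;  // (weight, value)

// best total value of one subset from `a` plus one from `b` with combined weight <= L
long long bestUnderLimit(const vector<PW>& a, vector<PW> b, long long L) {
    // make b prefix-maximal: b[j].second = best value among weights <= b[j].first
    for (size_t j = 1; j < b.size(); j++) b[j].second = max(b[j].second, b[j - 1].second);
    long long ans = 0;
    int j = (int)b.size() - 1;
    for (const auto& [wa, va] : a) {                // a is scanned by increasing weight,
        while (j >= 0 && wa + b[j].first > L) j--;  // so the b pointer only ever moves left
        if (j < 0) break;
        ans = max(ans, va + b[j].second);
    }
    return ans;
}

int main() {
    // tiny illustrative example: halves {(0,0),(2,3),(5,4)} and {(0,0),(1,2),(4,9)}, L = 6
    vector<PW> a = {{0, 0}, {2, 3}, {5, 4}}, b = {{0, 0}, {1, 2}, {4, 9}};
    printf("%lld\n", bestUnderLimit(a, b, 6));      // prints 12 (weights 2 and 4)
}
```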

In E, my solution was $$$O(3^L)$$$: I used the fact that $$$n \leq 50$$$ and, inside the inclusion-exclusion, maintained the set of all candidates in one long long.

  • »
    »
    4 years ago, # ^ |
      Vote: I like it 0 Vote: I do not like it

    How did you get all subsets sorted by weight in $$$O(2^k)$$$? I did exactly the same thing, but I needed an extra log factor for sorting the subsets — it took me an hour to fit in the time limit.

    • »
      »
      »
      4 years ago, # ^ |
      Rev. 2   Vote: I like it +16 Vote: I do not like it

      You don't need to sort all the subsets. That's what I wrote in my previous comment; let me elaborate. Add elements $$$1, 2, \ldots, i$$$ one by one, and maintain the set $$$A_i$$$ of all subsets sorted by weight. $$$A_{i+1} = A_i \cup (A_i + x_i)$$$, so to get $$$A_{i+1}$$$ you can merge $$$A_i$$$ and $$$A_i + x_i$$$ in linear time; the total number of operations will be $$$1 + 2 + 4 + \ldots + 2^k \leq 2^{k+1}$$$.
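      A minimal sketch of this merging (the item list and names are illustrative):

```cpp
#include <bits/stdc++.h>
using namespace std;

using PW = pair<long long, long long>;  // (weight, value)

// all 2^k subset sums of `items`, sorted by weight, built by k linear merges
vector<PW> allSubsetsSortedByWeight(const vector<PW>& items) {
    vector<PW> cur = {{0, 0}};           // A_0: only the empty subset
    for (auto [w, v] : items) {
        vector<PW> shifted = cur;        // A_i + x_i: every old subset plus the new item
        for (auto& p : shifted) { p.first += w; p.second += v; }
        vector<PW> merged(cur.size() + shifted.size());
        merge(cur.begin(), cur.end(), shifted.begin(), shifted.end(), merged.begin());
        cur = move(merged);              // A_{i+1}, still sorted by weight
    }
    return cur;
}

int main() {
    // tiny illustrative example: items (weight, value) = (3, 5), (1, 2), (2, 4)
    vector<PW> items = {{3, 5}, {1, 2}, {2, 4}};
    for (auto [w, v] : allSubsetsSortedByWeight(items))
        printf("w=%lld v=%lld\n", w, v);  // 8 pairs, weights in non-decreasing order
}
```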

»
4 years ago, # |
  Vote: I like it +3 Vote: I do not like it

I solved E in $$$O(3^L \cdot N/64 + NK)$$$. It's dumb inclusion-exclusion.

  • We can simplify the problem to "among the chosen integers, each bit must be 0 in at least one and 1 in at least one of them".
  • In inclusion-exclusion, we choose for each bit whether we want it to be 0, want it to be 1 or don't care. For each of these options, we need to get a bitmask of suitable integers, take its popcount $$$P$$$ and add $$$\pm \sum_{i=1}^K {P \choose i} = \pm cnt_P$$$.
  • For each bit $$$i$$$, we can find the bitmask $$$m_{i, 0}$$$ of integers that have this bit equal to 0 and its complement $$$m_{i, 1}$$$. The $$$3^L$$$ bitmasks we're looking for are generated as their AND, I'm sure you can see it.
  • The bitmasks can't be computed in a completely straightforward DP because $$$3^{18} \cdot 8$$$ bytes is more than 1 GB and it's also slightly too slow. Solution: compute it for the first $$$16$$$ bits and for each of these $$$3^{16}$$$ options, bruteforce the bitmask+popcount for the last $$$L-16$$$ bits — kinda like loop unrolling. This takes 9x less memory and 1.5 seconds for $$$L = 18$$$.
  • »
    »
    4 years ago, # ^ |
    Rev. 3   Vote: I like it 0 Vote: I do not like it

    You don't need these memory hacks; just calculate the same thing with brute force instead of DP...

    Check out my submission: link
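    A minimal sketch of that brute-force inclusion-exclusion, applied to the reduced problem from the parent comment's first bullet: choose between 1 and K of M <= 50 candidate integers so that each of the L <= 18 relevant bits is 0 in at least one chosen integer and 1 in at least one. The input format is an assumption, and this plain recursion is slower than the optimized versions above.

```cpp
#include <bits/stdc++.h>
using namespace std;

int L, M, K;
unsigned long long m0[18], m1[18];  // m0[b] / m1[b]: bitmask of candidates whose bit b is 0 / 1
long long cnt[51];                  // cnt[P] = number of ways to pick between 1 and K of P candidates

// inclusion-exclusion over bits: no constraint, "all chosen have bit = 0", or "all chosen have bit = 1"
long long solve(int bit, unsigned long long cand, long long sign) {
    if (bit == L) return sign * cnt[__builtin_popcountll(cand)];
    long long res = solve(bit + 1, cand, sign);    // no constraint on this bit
    res += solve(bit + 1, cand & m0[bit], -sign);  // term: all chosen integers have this bit = 0
    res += solve(bit + 1, cand & m1[bit], -sign);  // term: all chosen integers have this bit = 1
    return res;
}

int main() {
    // assumed input: M K L, then the M candidate integers (already reduced to the L free bits)
    scanf("%d %d %d", &M, &K, &L);
    vector<unsigned long long> x(M);
    for (auto& v : x) scanf("%llu", &v);

    for (int b = 0; b < L; b++)
        for (int j = 0; j < M; j++)
            (x[j] >> b & 1 ? m1[b] : m0[b]) |= 1ULL << j;

    long long C[51][51] = {};  // binomial coefficients, M <= 50
    for (int n = 0; n <= 50; n++) {
        C[n][0] = 1;
        for (int r = 1; r <= n; r++) C[n][r] = C[n - 1][r - 1] + C[n - 1][r];
    }
    for (int P = 0; P <= M; P++)
        for (int i = 1; i <= min(P, K); i++) cnt[P] += C[P][i];

    unsigned long long all = (1ULL << M) - 1;  // all M candidates fit in one word (M <= 50)
    printf("%lld\n", solve(0, all, +1));
}
```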

»
4 years ago, # |
  Vote: I like it -8 Vote: I do not like it

If anyone is stuck like me on C, here is a detailed explanation (maybe helpful to someone) in this link.