darkshadows's blog

By darkshadows, 5 years ago, In English

Google Kick Start hosts online rounds throughout the year, giving participants the opportunity to test and grow their coding abilities while getting a sample of the programming skills needed for a technical career at Google. Top candidates might be invited for interviews.

I invite you to solve some fun and interesting problems on September 29, 2019 at 18:00 UTC.

The dashboard can be accessed here during the contest. Problem analyses will be published soon after the contest.

  • Vote: I like it
  • +58
  • Vote: I do not like it

»
5 years ago, # |
  Vote: I like it 0 Vote: I do not like it

Bump!

»
5 years ago, # |
  Vote: I like it 0 Vote: I do not like it

How to solve Flattening?

  • »
    »
    5 years ago, # ^ |
      Vote: I like it +6 Vote: I do not like it

    dp[i][j][k] = minimum cost up to element A[i], with j segments already formed and the value of A[i] changed to k

    Spoiler
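
    The spoiler's code isn't reproduced here, but a minimal sketch of this kind of DP could look as follows (my code, not the original: it uses an "at most j height changes" convention, and MAXV = 1000 is an assumed bound on the heights; compressing them to [0, n) as suggested further down the thread works too).

    #include <bits/stdc++.h>
    using namespace std;

    int main() {
        int T;
        scanf("%d", &T);
        for (int tc = 1; tc <= T; tc++) {
            int n, K;
            scanf("%d %d", &n, &K);
            vector<int> a(n);
            for (int &x : a) scanf("%d", &x);

            const int MAXV = 1000;  // assumed bound on the heights
            const int INF = 1e9;
            // dp[j][v] = min changes so far, using at most j height changes,
            //            with the last element's final value equal to v
            vector<vector<int>> dp(K + 1, vector<int>(MAXV + 1, INF));
            for (int j = 0; j <= K; j++)
                for (int v = 1; v <= MAXV; v++) dp[j][v] = (v != a[0]);

            for (int i = 1; i < n; i++) {
                // best[j] = min over all values v of dp[j][v]
                vector<int> best(K + 1, INF);
                for (int j = 0; j <= K; j++)
                    for (int v = 1; v <= MAXV; v++) best[j] = min(best[j], dp[j][v]);

                vector<vector<int>> ndp(K + 1, vector<int>(MAXV + 1, INF));
                for (int j = 0; j <= K; j++)
                    for (int v = 1; v <= MAXV; v++) {
                        int cost = (a[i] != v);
                        // continue the current segment with value v
                        ndp[j][v] = min(ndp[j][v], dp[j][v] + cost);
                        // start a new segment with value v, spending one height change
                        if (j > 0) ndp[j][v] = min(ndp[j][v], best[j - 1] + cost);
                    }
                dp = move(ndp);
            }

            int ans = INF;
            for (int v = 1; v <= MAXV; v++) ans = min(ans, dp[K][v]);
            printf("Case #%d: %d\n", tc, ans);
        }
    }
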
    • »
      »
      »
      5 years ago, # ^ |
        Vote: I like it 0 Vote: I do not like it

      When you start building a new segment, as in dp[i][j + 1][A[i]] = min(dp[i][j + 1][A[i]], mn);, you increment the segment count without taking the previous value of the array into account. This step also reduces the complexity of the problem. Please explain.

      • »
        »
        »
        »
        5 years ago, # ^ |
          Vote: I like it 0 Vote: I do not like it

        In that transition, I open a new segment without changing A[i], so the previous value does not matter. Thus I take the minimum over all previous dp[i][j][v].

  • »
    »
    5 years ago, # ^ |
      Vote: I like it +16 Vote: I do not like it

    Here's my solution, which runs in $$$O(N^2K)$$$ and is slightly more efficient than the $$$O(NK A_i)$$$ solution given below (with the 15s time limit, the latter should still pass, but it may be slightly tight if implemented inefficiently).

    First, we create an array $$$cost$$$ that stores, for each range, the minimum number of elements we must change to make every value in that range the same. This equals the length of the range minus the number of occurrences of its most common value. We can compute all of these entries in $$$O(N^2)$$$: iterate over each possible starting position, and as we extend the ending position, maintain the number of times each value has occurred in the range along with the count of its most common value, filling in the cost array as we go.

    Then, we let $$$dp[i][j]$$$ be the minimum number of edits we need to make the first $$$i$$$ elements of the array include at most $$$j$$$ changes in height. We can compute $$$dp[i][j]$$$ as the minimum value of $$$dp[k][j-1] + cost[k+1][i]$$$ over all $$$k$$$. There are $$$O(N)$$$ values of $$$k$$$ to check for each pair of $$$i$$$ and $$$j$$$, and we have $$$N$$$ values for $$$i$$$ and $$$K+1$$$ for $$$j$$$, so our total complexity is $$$O(N^2K)$$$.
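
    A sketch of this approach (my code, not Geothermal's submission; the map used in the cost precomputation adds a log factor, which is irrelevant at these input sizes):

    #include <bits/stdc++.h>
    using namespace std;

    int main() {
        int T;
        scanf("%d", &T);
        for (int tc = 1; tc <= T; tc++) {
            int n, K;
            scanf("%d %d", &n, &K);
            vector<int> a(n);
            for (int &x : a) scanf("%d", &x);

            const int INF = 1e9;
            // cost[l][r] = changes needed to make a[l..r] constant
            //            = (length of the range) - (count of its most common value)
            vector<vector<int>> cost(n, vector<int>(n, 0));
            for (int l = 0; l < n; l++) {
                map<int, int> freq;
                int most = 0;
                for (int r = l; r < n; r++) {
                    most = max(most, ++freq[a[r]]);
                    cost[l][r] = (r - l + 1) - most;
                }
            }

            // dp[i][j] = min edits so that the first i elements contain
            //            at most j changes in height
            vector<vector<int>> dp(n + 1, vector<int>(K + 1, INF));
            for (int i = 1; i <= n; i++)
                for (int j = 0; j <= K; j++) {
                    dp[i][j] = cost[0][i - 1];          // the whole prefix is one segment
                    if (j > 0)
                        for (int k = 1; k < i; k++)     // last segment is a[k..i-1]
                            dp[i][j] = min(dp[i][j], dp[k][j - 1] + cost[k][i - 1]);
                }

            printf("Case #%d: %d\n", tc, dp[n][K]);
        }
    }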

    • »
      »
      »
      5 years ago, # ^ |
      Rev. 2   Vote: I like it 0 Vote: I do not like it

      @Geothermal, how do we solve this in O(N*N*log(N)) using binary search and dynamic programming, as mentioned in the editorial (analysis)?

    • »
      »
      »
      5 years ago, # ^ |
        Vote: I like it 0 Vote: I do not like it

      You don't even need to pre-calculate the cost of ranges. I solved it in $$$O(N^2*K)$$$ too; the states are $$$i, j, k$$$, where:

      • i: current position in the array
      • j: index of the last not-modified element (its original value defines the current segment's value)
      • k: number of height changes used so far

      Then you just need to check the following transitions:

      • Change the current element to match $$$A[j]$$$: $$$solve(i + 1, j, k) + 1$$$
      • If $$$k < K$$$, keep the current element and start a new segment with it: $$$solve(i + 1, i, k + 1)$$$
      • If $$$A[i] = A[j]$$$, keep this element in the current segment: $$$solve(i + 1, j, k)$$$

      Get the minimum of all possible transitions.
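
      A memoized sketch along these lines (my interpretation, not the commenter's code: I add a sentinel value of j, meaning the current segment's value has not been fixed by a kept element yet, to cover segments that begin with modified elements):

      #include <bits/stdc++.h>
      using namespace std;

      int N, K;
      vector<int> A;
      // memo[i][j][k]; j == N is the sentinel "no kept element fixes the
      // current segment's value yet"
      vector<vector<vector<int>>> memo;

      // i: current position, j: index of the kept element defining the current
      // segment's value (or N), k: height changes used so far
      int solve(int i, int j, int k) {
          if (i == N) return 0;
          int &res = memo[i][j][k];
          if (res != -1) return res;
          res = N + 1;  // changing everything costs at most N, so this acts as "infinity"
          // 1) change A[i] so it matches the current segment's value
          res = min(res, solve(i + 1, j, k) + 1);
          // 2) keep A[i] in the current segment (allowed if it matches the
          //    segment's value, or if that value is not fixed yet)
          if (j == N || A[i] == A[j]) res = min(res, solve(i + 1, i, k));
          // 3) start a new segment at i, spending one height change: either keep
          //    A[i] as its value, or change A[i] and leave the value open
          if (k < K) {
              res = min(res, solve(i + 1, i, k + 1));
              res = min(res, solve(i + 1, N, k + 1) + 1);
          }
          return res;
      }

      int main() {
          int T;
          scanf("%d", &T);
          for (int tc = 1; tc <= T; tc++) {
              scanf("%d %d", &N, &K);
              A.assign(N, 0);
              for (int &x : A) scanf("%d", &x);
              memo.assign(N, vector<vector<int>>(N + 1, vector<int>(K + 1, -1)));
              printf("Case #%d: %d\n", tc, solve(0, N, 0));
          }
      }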

  • »
    »
    5 years ago, # ^ |
    Rev. 3   Vote: I like it +10 Vote: I do not like it

    We can compress the values A[i] to integers in the range [0, n). This reduces the complexity to O(n * n * k).

    Spoiler
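
    For reference, the compression step itself is just sort, deduplicate, and binary search (a standard snippet, not the spoiler's code):

    #include <bits/stdc++.h>
    using namespace std;

    // Replace each height by its rank among the distinct heights,
    // so every value ends up in [0, n).
    void compress(vector<int> &a) {
        vector<int> vals = a;
        sort(vals.begin(), vals.end());
        vals.erase(unique(vals.begin(), vals.end()), vals.end());
        for (int &x : a)
            x = lower_bound(vals.begin(), vals.end(), x) - vals.begin();
    }

    int main() {
        vector<int> a = {300, 100, 100, 1000, 300};
        compress(a);
        for (int x : a) printf("%d ", x);   // prints: 1 0 0 2 1
        printf("\n");
    }
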
    • »
      »
      »
      5 years ago, # ^ |
        Vote: I like it 0 Vote: I do not like it

      highbias, can you please state what dp[i][j][k] represents in your code?

      • »
        »
        »
        »
        5 years ago, # ^ |
          Vote: I like it 0 Vote: I do not like it

        dp[i][j][k] = minimum number of edits for the first i elements, where the current element's value is j and the number of unsatisfied positions (A[i] != A[i + 1]) is k.

  • »
    »
    5 years ago, # ^ |
      Vote: I like it 0 Vote: I do not like it

    I used dp[i][j], where each state (i, j) represents the minimum number of changes needed to build j segments (of equal elements) up to index i. The final answer is the minimum over all dp(n, j) for j from 1 to k + 1 (if we have k + 1 segments, there are k changes). To calculate each state (i, j), we loop from l = i down to 1 and set dp[i][j] = min(dp[i][j], dp[l][j - 1] + (i - l + 1 - max_count)). Here, l represents the first element of the last segment, and max_count is the maximum occurrence count of any element in the range [l, i]. So basically, in the last segment, we make all elements equal to the most frequently occurring element.

    Code: https://ideone.com/i80LgQ

»
5 years ago, # |
  Vote: I like it +1 Vote: I do not like it

How to solve 'Teach Me'?

  • »
    »
    5 years ago, # ^ |
      Vote: I like it +1 Vote: I do not like it

    We do complementary counting. There are initially $$$N^2$$$ ordered pairs of workers. Then, we subtract those which aren't valid mentorship pairs. Create a map mapping sets of skills to the number of times they appear in the data. Then, for each set of skills, iterate over its subsets. If set Y is a subset of set X and X appears A times in the data while Y appears B times in the data, then we have $$$AB$$$ invalid ordered mentorship pairs, since the $$$B$$$ workers with the skillset $$$Y$$$ cannot mentor the $$$A$$$ workers with the skillset $$$X$$$. We thus subtract the value of $$$AB$$$ for each subset of each skillset.

    To evaluate our complexity, note that there are $$$N$$$ workers, each with $$$2^{C_i}$$$ subsets of its skillset. Moreover, each of these subsets takes $$$O(C_i)$$$ time to construct. If we use an unordered map to perform our count queries, our complexity is thus $$$O(N C_i 2^{C_i})$$$. I used a regular map in C++ to avoid the occasional hacks that plague unordered_map, and even with the extra $$$O(\log N)$$$ factor, this passed in time. (On Codeforces, my submission took just about eight seconds to solve a large test set twenty times, so it may have been a fairly tight squeeze, but it did seem to work out.)
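
    A sketch of this counting (my code, not Geothermal's submission; it keys the map by the sorted list of skill IDs rather than a packed representation of the skillset):

    #include <bits/stdc++.h>
    using namespace std;

    int main() {
        int T;
        scanf("%d", &T);
        for (int tc = 1; tc <= T; tc++) {
            int n, s;
            scanf("%d %d", &n, &s);
            map<vector<int>, long long> cnt;   // sorted skillset -> #workers with exactly it
            for (int i = 0; i < n; i++) {
                int c;
                scanf("%d", &c);
                vector<int> skills(c);
                for (int &x : skills) scanf("%d", &x);
                sort(skills.begin(), skills.end());
                cnt[skills]++;
            }
            long long ans = (long long)n * n;  // all ordered pairs (self-pairs get removed below)
            for (auto &[skills, a] : cnt) {
                int c = skills.size();
                // subtract pairs where the would-be mentor's skillset is a subset
                // of this skillset (including the skillset itself)
                for (int mask = 0; mask < (1 << c); mask++) {
                    vector<int> sub;
                    for (int b = 0; b < c; b++)
                        if (mask & (1 << b)) sub.push_back(skills[b]);
                    auto it = cnt.find(sub);
                    if (it != cnt.end()) ans -= a * it->second;
                }
            }
            printf("Case #%d: %lld\n", tc, ans);
        }
    }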

  • »
    »
    5 years ago, # ^ |
    Rev. 2   Vote: I like it +5 Vote: I do not like it

    A straightforward solution using bitsets passes without any problem.

    • »
      »
      »
      5 years ago, # ^ |
        Vote: I like it 0 Vote: I do not like it

      Can you please explain the solution? My bitset solution could only pass the first test set, as it's O(N^2).

      • »
        »
        »
        »
        5 years ago, # ^ |
        Rev. 2   Vote: I like it 0 Vote: I do not like it

        Let $$$havenot_i$$$ be a bitset of size $$$n$$$: for each skill $$$i \leq S$$$ we store such a bitset, where $$$havenot_{i,j}$$$ is $$$1$$$ if the $$$j$$$-th person does not have the $$$i$$$-th skill. For each person, taking the bitwise $$$OR$$$ of the bitsets of the skills they have gives the set of people they can mentor, so we just add the number of $$$1$$$s in the ORed bitset to our answer. The total time complexity is of the order $$$O(\frac{N^2}{\omega} \cdot c + N \cdot S)$$$ (if I'm not mistaken), where $$$\omega$$$ is the machine word size.
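
        A sketch of this idea (my reconstruction, not the commenter's code; MAXN is an assumed upper bound on $$$N$$$ and should be adjusted to the actual constraints):

        #include <bits/stdc++.h>
        using namespace std;

        const int MAXN = 50000;   // assumed upper bound on N
        const int MAXS = 1001;

        // have[sk]: bit j is 1 if person j has skill sk
        bitset<MAXN> have[MAXS];

        int main() {
            int T;
            scanf("%d", &T);
            for (int tc = 1; tc <= T; tc++) {
                int n, s;
                scanf("%d %d", &n, &s);
                for (int sk = 1; sk <= s; sk++) have[sk].reset();
                vector<vector<int>> skills(n);
                for (int i = 0; i < n; i++) {
                    int c;
                    scanf("%d", &c);
                    skills[i].resize(c);
                    for (int &x : skills[i]) {
                        scanf("%d", &x);
                        have[x][i] = 1;
                    }
                }
                // mask of the people that actually exist in this test case
                bitset<MAXN> alive;
                for (int i = 0; i < n; i++) alive[i] = 1;

                long long ans = 0;
                for (int i = 0; i < n; i++) {
                    // people missing at least one of person i's skills = people i can mentor
                    bitset<MAXN> can;
                    for (int sk : skills[i]) can |= alive & ~have[sk];
                    ans += can.count();
                }
                printf("Case #%d: %lld\n", tc, ans);
            }
        }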

»
5 years ago, # |
  Vote: I like it 0 Vote: I do not like it

Does anyone have a Python solution for the last problem that doesn't get TLE? Or is it just impossible to get AC in Python?

»
5 years ago, # |
  Vote: I like it 0 Vote: I do not like it

Nice problems. Failed C-large since I used int somewhere :(

  • »
    »
    5 years ago, # ^ |
      Vote: I like it +6 Vote: I do not like it

    I had a similar issue: I accidentally computed $$$N^2$$$ before converting to long on problem B. Luckily, I caught my error while testing my program's runtime to make sure it would actually pass the large dataset, so I was able to resubmit before the end of the contest (though unfortunately, it did drop me from 3rd to 16th).

»
5 years ago, # |
  Vote: I like it 0 Vote: I do not like it

What is the expected time complexity for Teach Me?

  • »
    »
    5 years ago, # ^ |
      Vote: I like it +3 Vote: I do not like it

    My code's complexity was O(32*N*log(32*N)), but I got TLE.

    • »
      »
      »
      5 years ago, # ^ |
        Vote: I like it +4 Vote: I do not like it

      This may have been a constant factor issue, as I believe solutions involving the extra logarithmic factor can work (see my discussion above), but it's a pretty tight squeeze. That said, it's possible to cut out the logarithmic factor by using an unordered set/map instead of an ordered one.

      • »
        »
        »
        »
        5 years ago, # ^ |
          Vote: I like it 0 Vote: I do not like it

        I reduced the complexity to O(32*N*logN) and it passed.

»
5 years ago, # |
  Vote: I like it 0 Vote: I do not like it

How to solve Spectating Villages (Problem C)?

  • »
    »
    5 years ago, # ^ |
      Vote: I like it +9 Vote: I do not like it

    The basic idea is tree DP. We can arbitrarily root the tree and store three DP values for each vertex. One stores the maximum score over the subtree rooted at this vertex if this vertex is not illuminated at all, one stores the same value if this vertex is illuminated by one of its children but does not contain a lighthouse, and one stores the same value if our vertex contains a lighthouse. The transitions get a little messy but can be computed efficiently for an $$$O(N)$$$ solution in total. Feel free to check out my submission on the Google Kickstart website for implementation details on the transitions.
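
    A sketch of one way to realize these three states and their transitions (my code, not Geothermal's submission; B holds the beauty values, and the input format of N, the N beauties, then the edges is assumed):

    #include <bits/stdc++.h>
    using namespace std;
    typedef long long ll;

    int main() {
        int T;
        scanf("%d", &T);
        for (int tc = 1; tc <= T; tc++) {
            int n;
            scanf("%d", &n);
            vector<ll> B(n);
            for (ll &b : B) scanf("%lld", &b);
            vector<vector<int>> adj(n);
            for (int i = 0; i < n - 1; i++) {
                int u, v;
                scanf("%d %d", &u, &v);
                u--, v--;
                adj[u].push_back(v);
                adj[v].push_back(u);
            }

            const ll NEG = LLONG_MIN / 4;
            // dp[v][0]: best score in v's subtree, v not illuminated at all
            // dp[v][1]: v illuminated by a child's lighthouse, no lighthouse at v
            // dp[v][2]: a lighthouse is placed at v
            vector<array<ll, 3>> dp(n);

            // root the tree at 0 and get a BFS order, so that reversing it
            // processes every child before its parent
            vector<int> par(n, -1), order;
            order.push_back(0);
            for (int idx = 0; idx < (int)order.size(); idx++) {
                int v = order[idx];
                for (int to : adj[v])
                    if (to != par[v]) { par[to] = v; order.push_back(to); }
            }

            for (int idx = n - 1; idx >= 0; idx--) {
                int v = order[idx];
                ll noLight = 0;          // accumulates dp[v][0]
                ll withLight = B[v];     // accumulates dp[v][2]
                ll freeSum = 0;          // children unconstrained, used for dp[v][1]
                ll bestForce = NEG;      // cheapest way to force one child to hold a lighthouse
                for (int c : adj[v]) {
                    if (c == par[v]) continue;
                    noLight   += max(dp[c][0], dp[c][1]);
                    withLight += max({dp[c][2], dp[c][1], dp[c][0] + B[c]});
                    ll freeBest = max({dp[c][0], dp[c][1], dp[c][2]});
                    freeSum += freeBest;
                    bestForce = max(bestForce, dp[c][2] - freeBest);
                }
                dp[v][0] = noLight;
                dp[v][2] = withLight;
                dp[v][1] = (bestForce == NEG) ? NEG : B[v] + freeSum + bestForce;
            }
            printf("Case #%d: %lld\n", tc, max({dp[0][0], dp[0][1], dp[0][2]}));
        }
    }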

    • »
      »
      »
      5 years ago, # ^ |
        Vote: I like it 0 Vote: I do not like it

      Can you explain the DP relation?

»
5 years ago, # |
  Vote: I like it 0 Vote: I do not like it

The problem statements should mention that the sum of N over all test cases doesn't exceed some limit.

As in previous Kick Start rounds, it helps to know at which complexity a solution won't give TLE. Today, in Flattening and Teach Me, it was really needed to make sure the code would not give TLE on the larger input.

  • »
    »
    5 years ago, # ^ |
      Vote: I like it 0 Vote: I do not like it

    Usually when that's not mentioned, it means you should be able to handle $$$T$$$ cases of the maximum size, but I'm not sure if that's the case here.

»
5 years ago, # |
  Vote: I like it +3 Vote: I do not like it

Can anyone explain how to solve Flattening in O(N^2 log N)? (referenced in the official analysis)