bablu_45's blog

By bablu_45, history, 5 years ago, In English

Given a 1-D array (candy-crush style), return the shortest array obtainable by repeatedly removing runs of consecutive equal numbers, where a run may be removed only if its length is >= 3.

input: 1-d array [1, 3, 3, 3, 2, 2, 2, 3, 1]

return: [1, 1] (explanation: 133322231->133331->11).

The order of removals matters: 133322231->122231->131 gives [1, 3, 1], which is not the shortest.

Time complexity should be better than O(n^2)
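Since the statement is easy to misread, here is a naive reference solver (exponential with memoization; the function name is mine) that defines the problem operationally: at each step, try removing any 3 or more elements of a run of equal neighbors and recurse. It is only for checking small cases, nowhere near the required bound.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Naive reference solver: try every legal removal and recurse.
// Exponential time with memoization; only for verifying small cases.
int shortestLen(const vector<int>& a, map<vector<int>, int>& memo) {
    auto it = memo.find(a);
    if (it != memo.end()) return it->second;
    int best = a.size();                                // option: stop removing
    for (int i = 0; i < (int)a.size(); ) {
        int j = i;
        while (j < (int)a.size() && a[j] == a[i]) j++;  // maximal run [i, j)
        int len = j - i;
        // removing any k >= 3 elements of this run leaves len - k of them behind
        for (int k = 3; k <= len; k++) {
            vector<int> b(a.begin(), a.begin() + i);
            b.insert(b.end(), len - k, a[i]);
            b.insert(b.end(), a.begin() + j, a.end());
            best = min(best, shortestLen(b, memo));
        }
        i = j;
    }
    return memo[a] = best;
}
```

On the example above it returns 2 (the array [1, 1]), and on 2, 2, 2, 1, 1, 1, 2, 2 (a case discussed in the comments) it returns 0.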


»
5 years ago, # |

First, compress the given array into a vector of pairs where the first value of each pair denotes the element and the second denotes its frequency. Now no two adjacent pairs share the same element. For example, the vector for your array will be

{(1, 1), (3, 3), (2, 3), (3, 1), (1, 1)}

Next, make a stack of pairs and push the pairs from the vector onto it according to the algorithm below:

  1. If the frequency of the current pair is greater than 2, don't push it.

  2. Else, if the stack is not empty and the current element equals the element at the top of the stack, add the current frequency to the frequency of the top pair; if the result is 3 or more, pop it. In either case, don't push the current pair.

  3. Else, push the current pair.

Finally, expand the stack contents to obtain the resulting array. Time complexity is O(n).
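A literal transcription of these rules in C++ (the function name is mine) might look as follows; note that, as the replies below demonstrate, this greedy does not always produce the shortest array, so treat it as a sketch of the proposal rather than a working solution:

```cpp
#include <bits/stdc++.h>
using namespace std;

// Sketch of the proposed compress-then-stack greedy.
// As the replies point out, it does not always find the shortest array.
vector<int> candyCrushGreedy(const vector<int>& a) {
    // run-length compress into (value, frequency) pairs
    vector<pair<int,int>> runs;
    for (int x : a) {
        if (!runs.empty() && runs.back().first == x) runs.back().second++;
        else runs.push_back({x, 1});
    }
    vector<pair<int,int>> st;                 // stack of surviving pairs
    for (auto [val, freq] : runs) {
        if (freq > 2) continue;               // rule 1: drop long runs outright
        if (!st.empty() && st.back().first == val) {
            st.back().second += freq;         // rule 2: merge with the top
            if (st.back().second >= 3) st.pop_back();
        } else {
            st.push_back({val, freq});        // rule 3: push the pair
        }
    }
    vector<int> res;                          // expand the stack back out
    for (auto [val, freq] : st) res.insert(res.end(), freq, val);
    return res;
}
```

For instance, on 1, 1, 2, 2, 2, 1, 2, 2, 2, 1 it returns [1] although the whole array can be cleared, and on the blog's own example it returns [1, 3, 1] instead of [1, 1].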

  • »
    »
    5 years ago, # ^ |

    It doesn't work:

    If the array is 1, 1, 2, 2, 2, 1, 2, 2, 2, 1

    then the compressed vector will be (1, 2), (2, 3), (1, 1), (2, 3), (1, 1).

    The algorithm you described will do the following:

    1. Add to the stack (1, 2)

    2. Ignore (2, 3)

    3. Merge (1, 1) into the top pair (1, 2), giving (1, 3); since the frequency reached 3, pop it

    4. Ignore (2, 3)

    5. Add to the stack (1, 1)

    So your algorithm is unable to clear the vector.

    A correct solution would be to pop both (2, 3) pairs and be left with (1, 2), (1, 1), (1, 1), i.e. four 1's in a row, which can be removed together.

    • »
      »
      »
      5 years ago, # ^ |

      What should the algorithm do in this case:

      the array is 2, 2, 2, 1, 1, 1, 2, 2.

      The correct answer is the empty vector (remove 1, 1, 1, then the merged run 2, 2, 2, 2, 2).

»
5 years ago, # |

Write the input array in encoded form with frequencies, like this: (1,1),(3,3),(2,3),(3,1),(1,1).

Moving left to right, push the pairs onto a stack one by one. When you encounter a pair with freq < 3 (and the stack is not empty): pop the top until the top's freq >= 3 and it holds a different element, then push the pair with freq < 3; if the top holds the same element, just update the top's freq.

Then do the same process from right to left.

The shorter result of the two passes is your answer.

»
5 years ago, # |

Okay folks, can anyone here solve it correctly in $$$O(n^2)$$$ time? I failed to do it even though I tried for a bit. :/

The best I can think of is some $$$O(n^3)$$$ sped up by bitsets.

  • »
    »
    5 years ago, # ^ |

    Edit: wrong.

    Maybe something like this:

    Let's suppose we can only remove 3, 4, or 5 equal consecutive elements; this is still equivalent, since any longer removal can be split into chunks of 3 to 5.

    $$$dp_{i, j} = 1$$$ iff we can completely remove the subarray $$$[i, j]$$$.

    To compute the transition, we must remove element $$$A_j$$$. If we intend to remove $$$A_j$$$ with 2 more elements at indices $$$x, y (x < y)$$$, then we must first be able to remove $$$[x + 1, y - 1]$$$ and $$$[y + 1, j - 1]$$$, then we remove the 3 elements, and then we should remove $$$[i, x - 1]$$$, so these are the subproblems we should look at. If we want to remove $$$k$$$ elements, there are $$$k$$$ subproblems to look at. Since we only allow removing 3, 4 or 5, the amount of time to compute $$$dp_{i, j}$$$ is $$$O(1)$$$.

    Onto the original problem: if we mark all non-removed elements as "special", then between adjacent special indices we must remove the whole subarray. If we let $$$D_i$$$ be the minimum number of special indices we can keep in the prefix $$$[0, i]$$$, then the transition is:

    $$$D_j = \min(D_{j - 1} + 1, D_{i - 1})$$$ for all $$$i$$$ such that $$$dp_{i, j} = 1$$$.

    Seems too simple, am I missing something?

    • »
      »
      »
      5 years ago, # ^ |

      Since we only allow removing 3, 4 or 5, the amount of time to compute $$$dp_{i, j}$$$ is $$$O(1)$$$.

      How to do this?

      • »
        »
        »
        »
        5 years ago, # ^ |

        For each index $$$j$$$, precompute the previous index $$$i$$$ such that $$$A_i = A_j$$$; then we can obtain the last 4 occurrences of a value in $$$O(1)$$$. Deciding to remove 3, 4 or 5 then considers $$$O(1)$$$ subproblems.

        Edit: Now I see the issue, we don't necessarily group $$$A_j$$$ with the ones immediately before it. My bad.

        • »
          »
          »
          »
          »
          5 years ago, # ^ |

          Yup, there might be values in the middle that are equal to $$$A_j$$$ but we removed them earlier. Like in $$$[1, 2, 2, 1, 1, 1, 2, 1, 1]$$$.

»
5 years ago, # |

This was also posted in Leetcode forum yesterday, link. I described my $$$O(n^3)$$$ approach in short. Other users keep proposing some stupid map or stack solutions that don't make sense. I spent some time thinking and can't even get $$$O(n^2)$$$. My guess is that the question was supposed to say that values disappear from left to right (or the other way around). Then there is a linear solution with stack.

I'm copying a short description of my $$$O(n^3)$$$ solution:

Spoiler
  • »
    »
    5 years ago, # ^ |

    I think someone out there may be having a little fun; I've noticed these types of posts in recent days:

    1.) Someone with no history on the platform.

    2.) Posts a question claiming it's from an interview.

    3.) It looks easy and gets a lot of bogus "solutions".

    4.) Has a constraint that makes the question impossible. Like saying you can only use O(1) extra memory when any known solution requires O(N) memory. Or, like this one, requiring better than O(N^2) complexity when even O(N^3) is difficult.

    Unless these companies have radically changed their interviewing techniques recently I doubt they would ask these questions of candidates. They typically ask questions even low-blue coders can easily do on a whiteboard.

    • »
      »
      »
      5 years ago, # ^ |

      so we are getting pranked?

      • »
        »
        »
        »
        5 years ago, # ^ |

        I dunno for sure, just a hunch. I saw another post recently and thought, oh, that's easy; then I sat down and realized, wait a minute, that's actually quite difficult, and I couldn't come up with a solution despite how easy it sounds. And it was the same modus operandi: a user with no history, saying it was an [insert big name company here] interview question, posing a problem that looks easy but turns out to be hard. Not that I'm some great intellect, and my failing to solve it doesn't prove anything, but if Errichto and mnbvmar each spent some time thinking about it and couldn't come up with better than O(N^3), then I'd say it probably wasn't a telephone interview question from Google.

        Or it could be, and I have seen this before, that someone doesn't understand the question fully and leaves out a tidbit: a constraint that makes it solvable. Like Errichto pointed out, being required to always take from the beginning or end greedily makes it an easy stack-based solution.

        And to put it all into perspective here is my Google phone interview question:

        void sum(vector<int>& a, int w){
             // put code here ...
        }
        

        Given a matrix in row-major form, where w is the width of each row, alter the matrix in place so that each a[i][j] equals the sum of all elements a[0..i][0..j] of the original array, inclusive. Do it optimally and give your complexity.

        So I start out. I know it must be linear because O(N^2) is so obvious, and it looks like a beginner DP question. OK, it feels like I can do

        #include <bits/stdc++.h>
        using namespace std;
        
        void sum(vector<int>& a, int w){
           int n = a.size()/w, m = w;
           for(int i=1;i<n;i++) a[i*w]+=a[(i-1)*w];
           for(int j=1;j<m;j++) a[j]+=a[j-1];
           for(int i=1;i<n;i++) for(int j=1;j<m;j++) 
              a[i*w+j] += a[(i-1)*w+j] + a[i*w+j-1];
        }
        

        I think about it, work some cases out by hand (you have no compiler or IDE, you have to type in a hangout) and realize the numbers I get are too high. So what am I doing, what am I counting too much? Finally, after 5 minutes (but it felt like an eternity) it hit me: the overlapping area of the rectangles in the upper left corner is summed twice, so I need to subtract that area.

        #include <bits/stdc++.h>
        using namespace std;
        
        void sum(vector<int>& a, int w){
           int n = a.size()/w, m = w;
           // prefix sums down the first column, then across the first row
           for(int i=1;i<n;i++) a[i*w]+=a[(i-1)*w];
           for(int j=1;j<m;j++) a[j]+=a[j-1];
           // inclusion-exclusion: subtract the overlap a[0..i-1][0..j-1] counted twice
           for(int i=1;i<n;i++) for(int j=1;j<m;j++) 
              a[i*w+j] += a[(i-1)*w+j] + a[i*w+j-1] - a[(i-1)*w+j-1];
        }
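        A quick numeric check of the corrected routine (the 2x3 matrix below is made up for illustration; the function is a self-contained copy of the one above):

```cpp
#include <bits/stdc++.h>
using namespace std;

// Self-contained copy of the corrected prefix-sum routine.
void sum(vector<int>& a, int w){
   int n = a.size()/w, m = w;                 // n rows, m columns
   for(int i=1;i<n;i++) a[i*w]+=a[(i-1)*w];   // prefix down column 0
   for(int j=1;j<m;j++) a[j]+=a[j-1];         // prefix across row 0
   for(int i=1;i<n;i++) for(int j=1;j<m;j++)  // subtract doubly counted overlap
      a[i*w+j] += a[(i-1)*w+j] + a[i*w+j-1] - a[(i-1)*w+j-1];
}
```

        On {1, 2, 3, 4, 5, 6} with w = 3 this yields {1, 3, 6, 5, 12, 21}; e.g. a[1][1] = 1 + 2 + 4 + 5 = 12.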
        

        While I was trying to figure out what I was double-counting I said, "Sorry, this is a really easy question, I should have had it by now." And the interviewer said, "Oh no, this isn't easy at all, this is a hard question." So they consider this type of question tricky; even a blue coder like me was literally done in 15 minutes.

        So I doubt they'd throw up a question like the one given by OP in a phone interview, especially since the guy said it had to be better than O(N^2); in my mind that just seals it as a trolling operation.

        • »
          »
          »
          »
          »
          5 years ago, # ^ |

          I think you're right, all the interview questions I ever got were also very easy compared to the original poster's problem.

          So basically we are getting trolled by a guy posting questions that look laughably easy but are actually impossible. Good detective work! I wouldn't have thought of something like that.

          • »
            »
            »
            »
            »
            »
            5 years ago, # ^ |

            Even if they are trolling, it's still interesting how easy it sounds and how hard it is in practice. So at least they have some panache and flair in their trolling efforts.

            And maybe they aren't trolls. Could be I'm just a paranoiac who ascribes a sense of humor to incompetence. But I don't know, I tend to favor the conspiratorial outlook whenever the opportunity presents itself; it makes the world more interesting.

    • »
      »
      »
      5 years ago, # ^ |

      The most difficult part of interviews is sometimes the clarification questions. Some interviewers ask really underspecified questions and expect people to ask about the missing constraints.

    • »
      »
      »
      4 years ago, # ^ |

      Maybe he is pranking, but this question was asked in the first round of my coding interview, and I could only solve it partially. Since not many people made it through this question, I was shortlisted for the next round. Anyway, I am still looking for an optimal and correct solution.