Given a 1-d array (candy-crush style), return the shortest array obtainable after repeatedly removing runs of consecutive equal numbers (a run can be removed when the number of occurrences of the repeating number is >= 3).

input: 1-d array [1, 3, 3, 3, 2, 2, 2, 3, 1]

return: [1, 1] (Explanation: 133322231 -> 133331 -> 11).

133322231 -> 122231 -> 131 gives the answer [1, 3, 1], which is not the shortest.

Time complexity should be better than O(n^2)
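
For small inputs, the expected answers can be checked with a brute force that tries every possible removal (a reference sketch only, nowhere near the required complexity; the names `shortest`/`shortest_rec` are illustrative):

```cpp
#include <algorithm>
#include <map>
#include <vector>

// Exhaustive search over all removal orders: for every maximal run of length
// >= 3, try deleting every possible count k = 3..len of its elements, and
// memoise on the resulting array. Exponential, but useful as a reference
// implementation to check candidate algorithms against.
int shortest_rec(const std::vector<int>& a, std::map<std::vector<int>, int>& memo) {
    auto it = memo.find(a);
    if (it != memo.end()) return it->second;
    int n = (int)a.size(), best = n;
    for (int i = 0; i < n;) {
        int j = i;
        while (j < n && a[j] == a[i]) ++j;   // maximal run is [i, j)
        for (int k = 3; k <= j - i; ++k) {   // delete k elements of the run
            std::vector<int> b(a.begin(), a.begin() + i);
            b.insert(b.end(), a.begin() + i + k, a.end());
            best = std::min(best, shortest_rec(b, memo));
        }
        i = j;
    }
    memo[a] = best;
    return best;
}

int shortest(const std::vector<int>& a) {
    std::map<std::vector<int>, int> memo;
    return shortest_rec(a, memo);
}
```

On the example above, `shortest({1,3,3,3,2,2,2,3,1})` returns 2, confirming that [1, 1] is optimal.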

First, compress the given array into a vector of pairs where the first value in each pair denotes the element and the second denotes its frequency. Now no two adjacent pairs in the vector have the same value. For example, the vector for your array will be

{(1, 1), (3, 3), (2, 3), (3, 1), (1, 1)}
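
A quick sketch of this compression step (run-length encoding):

```cpp
#include <utility>
#include <vector>

// Collapse each maximal run into a (value, frequency) pair.
std::vector<std::pair<int, int>> compress(const std::vector<int>& a) {
    std::vector<std::pair<int, int>> runs;
    for (int x : a) {
        if (!runs.empty() && runs.back().first == x)
            ++runs.back().second;   // extend the current run
        else
            runs.push_back({x, 1}); // start a new run
    }
    return runs;
}
```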

Next, make a stack of pairs and push the pairs from the vector onto the stack using the algorithm below:

If the frequency of the current pair is greater than 2, don't add it.

Else if the stack is not empty and the current element is the same as the element at the top of the stack, update the frequency of the pair at the top of the stack and check whether the new frequency is smaller than 3; if not, pop it. In both cases, don't push the current pair from the vector.

Else push the current pair.

Finally, expand the stack contents to obtain your resultant array. Time complexity is O(n).
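
A sketch of the described pass in C++ (note: the replies below show this algorithm is not correct, so treat it as an illustration that makes the counterexamples easy to reproduce):

```cpp
#include <utility>
#include <vector>

// The stack algorithm as described: skip runs with frequency >= 3, merge
// equal neighbours on the stack, and pop a merged pair once its frequency
// reaches 3. (Shown below in the thread to be incorrect in general.)
std::vector<int> stack_reduce(const std::vector<int>& a) {
    // compress into (value, frequency) runs on the fly
    std::vector<std::pair<int, int>> runs;
    for (int x : a) {
        if (!runs.empty() && runs.back().first == x) ++runs.back().second;
        else runs.push_back({x, 1});
    }
    std::vector<std::pair<int, int>> st;  // vector used as a stack
    for (auto [v, f] : runs) {
        if (f >= 3) continue;                       // rule 1: drop removable runs
        if (!st.empty() && st.back().first == v) {  // rule 2: merge with the top
            st.back().second += f;
            if (st.back().second >= 3) st.pop_back();
        } else {
            st.push_back({v, f});                   // rule 3: push as-is
        }
    }
    std::vector<int> out;                 // expand the surviving pairs
    for (auto [v, f] : st) out.insert(out.end(), f, v);
    return out;
}
```

Note that even on the original example it returns [1, 3, 1], the non-optimal answer, rather than [1, 1].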

It doesn't work:

If the vector is

`1, 1, 2, 2, 2, 1, 2, 2, 2, 1`

Then the 'compressed vector' will be

`(1, 2), (2, 3), (1, 1), (2, 3), (1, 1)`

The algorithm you described will do the following:

Add to the stack

`(1, 2)`

Ignore

`(2, 3)`

Update the pair

`(1, 2)`

with the pair `(1, 1)`

and remove them. Ignore

`(2, 3)`

Add to the stack

`(1, 1)`

So your algorithm is unable to clear the vector. A correct solution would be to pop both

`(2, 3)`

and be left with `(1, 2), (1, 1), (1, 1)`

which can be removed together. What to do in this case:

Vector is

`2, 2, 2, 1, 1, 1, 2, 2`

The correct answer is the empty vector.

Sir, this is a very brilliant approach. Can you please prove your greedy approach to me?

Write the input array in encoded form with frequencies, like this: (1,1),(3,3),(2,3),(3,1),(1,1).

Moving left to right:

Push the elements onto the stack one by one until you encounter a pair with freq < 3. If freq < 3 and the stack is not empty: pop the top until the top's freq >= 3 and it is a different element, then push the element with freq < 3 (if the top is the same as this element, just update the top's freq).

Do this process again, starting from right to left.

The shorter string from the two passes will be your answer.

Try

`4,6,6,5,5,5,6,4,4,4,1,1,1,3,2,2,2,3,3,1`

Yup, it fails.

You can use a stack for this problem.

Okay folks, can anyone here solve it correctly in $$$O(n^2)$$$ time? I failed to do it even though I tried for a bit. :/ The best I can think of is some $$$O(n^3)$$$ sped up by bitsets.

Edit: wrong.

Maybe something like this?:

Let's suppose we can only remove 3, 4, or 5 equal consecutive elements, this is still equivalent.

$$$dp_{i, j} = 1$$$ iff we can completely remove the subarray $$$[i, j]$$$.

To compute the transition, we must remove element $$$A_j$$$. If we intend to remove $$$A_j$$$ with 2 more elements at indices $$$x, y (x < y)$$$, then we must first be able to remove $$$[x + 1, y - 1]$$$ and $$$[y + 1, j - 1]$$$, then we remove the 3 elements, and then we should remove $$$[i, x - 1]$$$, so these are the subproblems we should look at. If we want to remove $$$k$$$ elements, there are $$$k$$$ subproblems to look at. Since we only allow removing 3, 4, or 5, the amount of time to compute $$$dp_{i, j}$$$ is $$$O(1)$$$.

Onto the original problem: if we mark all non-removed elements as "special", then between adjacent special indices we must remove the whole subarray. If we let $$$D_i$$$ be the minimum number of special indices we can keep in prefix $$$[0, i]$$$, then the transition is:

$$$D_j = \min(D_{j - 1} + 1, D_{i - 1})$$$ for all $$$i$$$ such that $$$dp_{i, j} = 1$$$.

Seems too simple, am I missing something?

How to do this?

For each index $$$j$$$ precompute the last index $$$i$$$ such that $$$A_i = A_j$$$, then we can obtain the last 4 occurrences of a value in $$$O(1)$$$. Deciding to remove 3, 4 or 5 then considers $$$O(1)$$$ subproblems.

Edit: Now I see the issue, we don't necessarily group $$$A_j$$$ with the ones immediately before it. My bad.

Yup, there might be values in the middle that are equal to $$$A_j$$$ but we removed them earlier. Like in $$$[1, 2, 2, 1, 1, 1, 2, 1, 1]$$$.

This was also posted in Leetcode forum yesterday, link. I described my $$$O(n^3)$$$ approach in short. Other users keep proposing some stupid map or stack solutions that don't make sense. I spent some time thinking and can't even get $$$O(n^2)$$$. My guess is that the question was supposed to say that values disappear from left to right (or the other way around). Then there is a linear solution with stack.

I'm copying a short description of my $$$O(n^3)$$$ solution:

Spoiler: You need dp with two 2-dimensional arrays (I guess that the first one can be removed, actually):

`bool can_remove[L][R]; int at_least[L][R];`

Here `at_least[L][R]` answers: if you want to remove everything from interval [L, R] other than the number x = a[L], what is the minimum number of occurrences of x that you can leave? If that's impossible, let this be -1 instead. If it's at least 3, then mark can_remove[L][R] as true.

The idea for the second array comes from the fact that removing e.g. 5555 is possible if we had something like _ _ 5 _ _ _ 5 _ _ _ _ _ 5 _ _ _ _ 5 _ _. Each of the intervals between the 5's should be removable, and we need to combine them in some way.

Getting the answer from can_remove[L][R] is an easy part. Define dp[i] as the smallest number of remaining elements up to i and go from left to right doing dp[i] = min(dp[i], dp[i-1]+1) and for each removable [L, R] -- dp[R] = min(dp[R], dp[L-1]). The complexity of this part is O(N + REMOVABLE_INTERVALS) = O(N^2).
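
The answer-extraction step can be sketched as follows, assuming the removable intervals have already been computed from can_remove (here passed in directly as 0-indexed inclusive intervals; the function name is illustrative). For [1,3,3,3,2,2,2,3,1] those intervals are [1,3], [4,6], [1,6], and [1,7], since after deleting 2,2,2 the 3's merge into a removable run of four:

```cpp
#include <algorithm>
#include <utility>
#include <vector>

// dp[i] = fewest elements left from the prefix of length i. Either keep
// element i-1 (cost 1), or delete an entire removable interval [l, i-1].
int min_remaining(int n, const std::vector<std::pair<int, int>>& removable) {
    std::vector<std::vector<int>> by_right(n);   // group intervals by right end
    for (auto [l, r] : removable) by_right[r].push_back(l);
    std::vector<int> dp(n + 1, 0);
    for (int i = 1; i <= n; ++i) {
        dp[i] = dp[i - 1] + 1;                   // keep element i-1
        for (int l : by_right[i - 1])            // or remove [l, i-1] entirely
            dp[i] = std::min(dp[i], dp[l]);
    }
    return dp[n];
}
```

With the intervals listed above, `min_remaining(9, {{1,3},{4,6},{1,6},{1,7}})` gives 2, matching the answer [1, 1]. As noted, this part is only O(N + REMOVABLE_INTERVALS); computing the intervals is the hard O(n^3) part.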

I think someone out there may be having a little fun, I've noticed these type posts in recent days:

1.) Someone with no history on the platform.

2.) Posts a question claiming its from an interview.

3.) It looks easy and gets a lot of bogus "solutions".

4.) Has a constraint that makes the question impossible. Like saying you can only have O(1) extra memory when any known solution requires O(N) memory. Or, like this one, requiring better than O(N^2) complexity when even O(N^3) is difficult.

Unless these companies have radically changed their interviewing techniques recently I doubt they would ask these questions of candidates. They typically ask questions even low-blue coders can easily do on a whiteboard.

So we are getting pranked?

I dunno for sure, just a hunch. I saw another post recently and I was like oh that's easy, then I sat down and was like, wait a minute, that's actually quite difficult I can't actually come up with a solution despite how easy it sounds. And it was the same modus operandi, a user with no history, saying it was an [insert big name company here] interview question, and then posing a problem that looks so easy but turns out to be hard. Not that I'm some great intellect, just because I can't solve it doesn't prove anything, but if Errichto and mnbvmar each spent some time thinking about it and couldn't come up with better than O(N^3) then I'd say it probably wasn't a telephone interview question from Google.

Or it could be, and I have seen this before, that someone doesn't fully understand the question and leaves out a tidbit: a constraint that makes it solvable. Like Errichto pointed out, being required to always take from the beginning or end greedily makes it an easy stack-based solution.

And to put it all into perspective here is my Google phone interview question:

Given a matrix in row-major form, where w is the width of each row, in-place alter the matrix such that each a[i][j] is equal to the sum of all elements a[0..i][0..j] inclusive of the original array. Do it optimally, give your complexity.

So I start out, I know it must be linear because O(N^2) is so obvious, and it looks like a beginner DP question. OK, it feels like I can do it.

I think about it, work some cases out by hand (you have no compiler or IDE, you have to type in a Hangout) and realize my numbers come out too high. So what am I doing, what am I counting too much? Finally, after 5 minutes (but it felt like an eternity) it hit me: I'm summing the overlapping area of the rectangles in the upper-left corner twice, so I need to subtract that area.

While I was trying to figure out what I was double-counting I said, "Sorry, this is a really easy question, I should have had this by now." And the interviewer said, "Oh no, this isn't easy at all, this is a hard question." So they consider this type of question to be tricky, even though a blue coder like me was literally done in 15 minutes.
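
For reference, the fix described is the standard inclusion-exclusion recurrence: add the prefix sums from above and from the left, then subtract the upper-left cell that was counted twice. A minimal sketch (the function name and use of a flat `std::vector` for the row-major matrix are just illustrative choices):

```cpp
#include <vector>

// In-place 2-D prefix sums on a row-major matrix of width w.
// Afterwards, a[i*w + j] holds the sum of the original a[0..i][0..j].
void prefix_sums(std::vector<long long>& a, int w) {
    int h = (int)a.size() / w;
    for (int i = 0; i < h; ++i)
        for (int j = 0; j < w; ++j) {
            long long up   = i > 0 ? a[(i - 1) * w + j] : 0;
            long long left = j > 0 ? a[i * w + (j - 1)] : 0;
            long long diag = (i > 0 && j > 0) ? a[(i - 1) * w + (j - 1)] : 0;
            a[i * w + j] += up + left - diag;   // inclusion-exclusion
        }
}
```

For example, the 2x3 matrix {1,2,3,4,5,6} becomes {1,3,6,5,12,21}, in O(h*w) time with O(1) extra memory.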

So I doubt they'd throw up a question like the one given by OP in a phone interview, especially since the guy said it had to be better than O(N^2); in my mind that just seals it as a trolling operation.

I think you're right, all the interview questions I ever got were also very easy compared to original poster's problem.

So basically we are getting trolled by a guy with questions that look trivially easy but are actually impossible. Good detective work! I wouldn't have thought of something like that.

Even if they are trolling, it's still interesting how easy it sounds and how hard it is in practice. So at least they have some panache and flair in their trolling efforts.

And, maybe, they aren't trolls. Could be I'm just paranoid, ascribing a sense of humor to incompetence. But I don't know, I tend to favor the conspiratorial outlook whenever the opportunity presents itself; it makes the world more interesting.

The most difficult part of interviews is sometimes the clarification questions. Some interviewers ask really stupid questions and expect people to ask about the missing constraints.

What level of company are we talking about?