My logic was the following: I built a frequency array, sorted it in decreasing order, and considered its first k elements, since the answer is clearly a subset of these. To decide where to stop, I used the rule that if f[i] < f[0]/2 (this felt intuitive to me), I don't consider that element. This is my submission 57141020. It would be helpful if someone could provide a test case showing why this reasoning is wrong. Thanks.
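For reference, here is a minimal sketch of the heuristic described above (the function name and `threshold` variable are illustrative, not taken from the actual submission): count frequencies, sort them in decreasing order, and drop any frequency below half of the largest.

```python
from collections import Counter

def surviving_frequencies(a):
    # Sketch of the described heuristic: frequency array, sorted decreasing.
    f = sorted(Counter(a).values(), reverse=True)
    # The f[i] < f[0]/2 cutoff; true division here, whereas integer
    # division (f[0] // 2) would behave differently, as noted below.
    threshold = f[0] / 2
    return [x for x in f if x >= threshold]
```

For example, `surviving_frequencies([1, 1, 1, 1, 2])` keeps only the frequency of the ones, since the single 2 falls below half of the maximum.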

Consider the following test case:

Even though the number of ones is more than twice the number of twos, it's still optimal to include that 2 in the final array. Under your assumption it would be better to leave out the 2 and take another 1 instead, so your program's answer is `1 1 1 1 1`.
Another test case that breaks it is one where your program outputs `1 2 1 2`, due to integer division causing your variable `tar` to be 1.
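Assuming `tar` is meant to be half of the largest frequency, the pitfall is that integer division truncates; with a maximum frequency of 2 or 3, the cutoff collapses to 1 and filters out nothing. A quick illustration (Python's `//` mimics C++ `int` division here):

```python
f0 = 3         # largest frequency in a small failing case
tar = f0 // 2  # truncating division, as with C++ ints
# tar is 1, and every frequency is at least 1, so the
# f[i] < tar cutoff never rejects any element.
print(tar)
```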