__Apocalypse__'s blog

By __Apocalypse__, 5 years ago, In English

My logic was the following. I created a frequency array and sorted it in decreasing order, then considered only its first k entries, since the answer is obviously a subset of those. To decide how far down this sorted list to go, I used the rule that if f[i] < f[0]/2 (which I found intuitive), then I don't consider that element. This is my submission 57141020. It would be helpful if someone could provide a test case showing why this reasoning is wrong. Thanks.
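In code, the idea looks roughly like this (a simplified sketch rather than my exact submission; in particular, printing k elements by cycling through the kept values is an assumption about the output step, inferred from the program's outputs on the test cases discussed below):

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    // Sketch of the heuristic described above, not the exact submission.
    int n, k;
    cin >> n >> k;
    map<int, int> freq;              // value -> frequency
    for (int i = 0; i < n; i++) {
        int x; cin >> x;
        freq[x]++;
    }

    // Frequency array, sorted in decreasing order of frequency.
    vector<pair<int, int>> f;        // (frequency, value)
    for (auto& [v, c] : freq) f.push_back({c, v});
    sort(f.rbegin(), f.rend());

    // Only the k most frequent values can appear in the answer.
    if ((int)f.size() > k) f.resize(k);

    // Drop every value whose frequency is below half of the largest one.
    // Note the integer division: tar = 3 / 2 evaluates to 1.
    int tar = f[0].first / 2;
    vector<int> kept;
    for (auto& [c, v] : f)
        if (c >= tar) kept.push_back(v);

    // Print k elements by cycling through the kept values (an assumed
    // output step, reconstructed from the answers quoted in the comment).
    for (int i = 0; i < k; i++)
        cout << kept[i % (int)kept.size()] << " \n"[i == k - 1];
    return 0;
}
```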

»
5 years ago, # |

Consider the following test case:

5 5
1 1 1 1 2

Even though the number of ones is more than twice the number of twos, it is still optimal to include that 2 in the final array. Under your assumption it is better to leave out the 2 and take another 1 instead, so your program's answer is 1 1 1 1 1. Another test case that breaks your solution is

4 4
1 1 1 2

where your program outputs 1 2 1 2 because integer division makes your variable tar equal to 1.
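Tracing the heuristic (as sketched in the post above) confirms both outputs: on the first case the frequencies are f = [4, 1], so tar = 4 / 2 = 2 and the 2, with frequency 1 < 2, is dropped; on the second case f = [3, 1], so tar = 3 / 2 = 1 under integer division, and the 2's frequency of 1 is not strictly less than 1, so it survives the filter and appears in the cycled output.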