
By __Apocalypse__, history, 5 years ago, In English

My logic was the following. I created a frequency array and sorted it in decreasing order, then considered only its first few elements, since the answer is obviously a subset of these. To decide how far to go, I used the rule that if f[i] < f[0]/2 (this felt intuitive to me), then I don't consider that element. This is my submission 57141020. It would be helpful if someone could provide a test case showing why this reasoning is wrong. Thanks.


»
5 years ago, # |

Consider the following test case:

5 5
1 1 1 1 2

Even though the number of ones is more than twice the number of twos, it is still optimal to include that 2 in the final array. Under your assumption the 2 is left out and replaced by another 1, so your program answers 1 1 1 1 1. Another test case that breaks it is

4 4
1 1 1 2

where your program outputs 1 2 1 2, because integer division makes your variable tar equal to 1.
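A minimal sketch of that filtering step (assuming the threshold logic described in the post and the variable name tar from the submission; the helper name is hypothetical) shows why the 2 survives the cutoff on this test:

```python
from collections import Counter

def surviving_frequencies(a):
    # Frequencies sorted in decreasing order, as in the original idea.
    freq = sorted(Counter(a).values(), reverse=True)
    tar = freq[0] // 2  # integer division, as in the submission
    # Drop an element only when f[i] < f[0]/2, i.e. keep it when f[i] >= tar.
    return [f for f in freq if f >= tar]

# For 1 1 1 2: freq = [3, 1] and tar = 3 // 2 = 1, so the lone 2
# (frequency 1) is NOT filtered out, because 1 < 1 is false.
print(surviving_frequencies([1, 1, 1, 2]))
```

With exact division the threshold would be 1.5 and the 2 would be dropped, which is why the integer division changes the program's behavior here.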