Agent_Lemon's blog

By Agent_Lemon, 2 years ago

You are given an array of N numbers. The game goes like this: you select any two elements P and Q from the array. The cost of selecting these two elements is (P | Q) - (P & Q), where "|" denotes the bitwise OR of two numbers and "&" denotes the bitwise AND. Then, out of these two numbers, you throw one away and put the other back into the array.

Each move thus reduces the size of the array by one. The game stops when there is only one element left. The total cost of reducing the array to a single element is the sum of the costs incurred across all the steps.

Now, your task is to reduce the array to a single element while minimizing the total cost incurred.

Constraints:

1 <= N <= 10^3

1 <= A[i] <= 10^3
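To make the statement concrete, here is a tiny brute-force sketch in Python (my own illustration, not part of the original statement: the example array and the function names are made up) that tries every possible sequence of moves on a small array:

```python
from functools import lru_cache
from itertools import combinations

def cost(p, q):
    # the statement's cost; as noted in the comments, this equals p ^ q
    return (p | q) - (p & q)

@lru_cache(maxsize=None)
def min_total(elems):
    # elems: sorted tuple of the values still in the array
    if len(elems) == 1:
        return 0
    best = float('inf')
    for i, j in combinations(range(len(elems)), 2):
        c = cost(elems[i], elems[j])
        for drop in (i, j):  # either of the two picked elements may be thrown away
            rest = tuple(sorted(v for k, v in enumerate(elems) if k != drop))
            best = min(best, c + min_total(rest))
    return best

print(min_total((4, 6, 7)))  # 3: pair 6 and 7 (cost 1), drop 7; pair 4 and 6 (cost 2)
```

This exhaustive search is only feasible for very small N, but it is handy for checking a real solution.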

»
2 years ago, # |

Make a complete graph where the cost of the edge between $$$u$$$ and $$$v$$$ is the cost as written in the statement. The answer is the cost of the minimum spanning tree of this graph.
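For these constraints ($$$N \le 10^3$$$), the $$$O(N^2)$$$ form of Prim's algorithm on the implicit complete graph suffices. A minimal Python sketch, assuming the array is given as `a` (the function name `mst_cost` is mine):

```python
def mst_cost(a):
    # O(N^2) Prim on the implicit complete graph;
    # edge weight (a[u] | a[v]) - (a[u] & a[v]) equals a[u] ^ a[v]
    n = len(a)
    in_tree = [False] * n
    dist = [float('inf')] * n  # cheapest edge from the tree to each vertex
    dist[0] = 0
    total = 0
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=dist.__getitem__)
        in_tree[u] = True
        total += dist[u]
        for v in range(n):
            if not in_tree[v]:
                dist[v] = min(dist[v], a[u] ^ a[v])
    return total

print(mst_cost([4, 6, 7]))  # 3
```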

»
2 years ago, # |

I like the tag 'hevi problem' (´∀` )

»
2 years ago, # |

I'm not sure why they stated the cost that way; it's just the XOR of P and Q. P|Q contains each power of 2 that appears in at least one of the two numbers, and P&Q contains those that appear in both. Subtracting leaves exactly the powers of 2 that appear in one number but not the other, which is P^Q.


Lazy 'proof' with z3
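A minimal sketch of such a check using the z3py bindings (this reconstruction is mine, assuming the z3-solver Python package; it may differ from the commenter's original code):

```python
from z3 import BitVecs, prove

p, q = BitVecs('p q', 32)
# ask z3 to prove the identity for all 32-bit values; prints "proved"
prove((p | q) - (p & q) == p ^ q)
```

The subtraction never borrows, because the set bits of p&q are a subset of the set bits of p|q, so the identity also holds in ordinary integer arithmetic.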
»
2 years ago, # |

This is a bit overkill for the given constraints, but if M is the maximum value in A, the problem can be solved in O(N*log^2(M)) using divide-and-conquer.

If A has fewer than two distinct elements, the answer is 0. Otherwise, find the highest bit in which any two elements of A differ, and split A into two arrays B and C, where B contains the elements of A with a 0 in that bit and C contains those with a 1. Solve the problem recursively on each of the two arrays.

Then, construct a bitwise trie out of B (over the bits below the current bit) and, for each element c of C, find the element b of B minimizing b XOR c. This can be done by walking the trie along c's bits, taking the branch matching c's current bit whenever it exists and the opposite branch otherwise. The answer is the sum of the two subproblems' answers, plus that minimum of b XOR c over the lower bits, plus 2^k where k is the current bit (the two halves must be joined by exactly one edge, and that edge necessarily differs in bit k).

Each level of the divide-and-conquer takes O(N*log(M)), and there are O(log(M)) levels, so the overall complexity is O(N*log^2(M)).
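A Python sketch of that recursion (my own reconstruction of the approach described above; the dict-based trie and the names `xor_mst` / `min_xor_into` are illustrative):

```python
def min_xor_into(trie, c, bit):
    # walk the trie with c's bits, preferring the matching branch;
    # returns min(b ^ c) over all values b stored in the trie
    best, node = 0, trie
    for k in range(bit, -1, -1):
        want = (c >> k) & 1
        if want in node:
            node = node[want]
        else:
            node = node[want ^ 1]
            best |= 1 << k
    return best

def xor_mst(vals, bit):
    # vals: distinct values agreeing on all bits above `bit`
    if len(vals) <= 1 or bit < 0:
        return 0
    lo = [v for v in vals if not (v >> bit) & 1]  # B: current bit is 0
    hi = [v for v in vals if (v >> bit) & 1]      # C: current bit is 1
    if not lo or not hi:
        return xor_mst(vals, bit - 1)
    trie = {}
    for b in lo:                                  # trie over the lower bits of B
        node = trie
        for k in range(bit - 1, -1, -1):
            node = node.setdefault((b >> k) & 1, {})
    cross = min(min_xor_into(trie, c, bit - 1) for c in hi)
    return (xor_mst(lo, bit - 1) + xor_mst(hi, bit - 1)
            + (1 << bit) + cross)

a = [4, 6, 7, 6]
vals = sorted(set(a))      # duplicates can always be merged at cost 0
print(xor_mst(vals, max(vals).bit_length() - 1))  # 3
```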

As other comments pointed out, the problem comes down to finding the cost of the minimum spanning tree of the complete graph on the elements of A, where each edge weight is the bitwise XOR of its endpoints. Note that an MST can be constructed by considering edges in order of nondecreasing weight (Kruskal's algorithm). Since the weight of each edge is the XOR of its endpoints, every edge joining two vertices that agree on the first k bits is considered before any edge joining two vertices that differ somewhere in those k bits. Thus, we can find the MST of each of the two subsets induced by the highest differing bit independently, and then connect the two subtrees optimally with a single edge.