Bengal_Tiger's blog

By Bengal_Tiger, 5 years ago, In English

I have solved the popular XOR maximization problem XMAX — XOR Maximization using Gaussian elimination. [**Problem statement**: Given a set of integers S = { a1, a2, a3, ..., a|S| }, we define a function X on S as follows: X(S) = a1 ^ a2 ^ a3 ^ ... ^ a|S|. Given a set of N integers, compute the maximum of the X-function over all subsets of the given starting set.]
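For context, my approach is the standard linear-basis technique: insert each number into a basis where `basis[b]` holds the vector whose highest set bit is `b`, then build the answer greedily from the top bit down. A minimal sketch (not my exact submission; names are placeholders, and I assume the inputs are nonnegative and fit in 63 bits):

```cpp
#include <bits/stdc++.h>
using namespace std;

const int BITS = 62;            // enough for values up to ~4.6 * 10^18
long long basis[BITS + 1];      // basis[b] = vector with leading bit b, or 0

// Insert x into the basis, keeping it in row-echelon form.
void insertVector(long long x) {
    for (int b = BITS; b >= 0; b--) {
        if (!((x >> b) & 1)) continue;          // b is not the leading bit of x
        if (!basis[b]) { basis[b] = x; return; } // new pivot found
        x ^= basis[b];                           // eliminate bit b, keep reducing
    }
    // x reduced to 0: it is dependent on the current basis
}

long long maxSubsetXor() {
    long long ans = 0;
    // Scan pivots from the most significant bit; taking basis[b] when it
    // increases ans is optimal because bit b dominates all lower bits.
    for (int b = BITS; b >= 0; b--)
        ans = max(ans, ans ^ basis[b]);
    return ans;
}

int main() {
    int n;
    scanf("%d", &n);
    for (int i = 0; i < n; i++) {
        long long a;
        scanf("%lld", &a);
        insertVector(a);
    }
    printf("%lld\n", maxSubsetXor());
    return 0;
}
```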

But when I apply the same code to XOR with Subset from CodeChef, it gets WA.

[**Problem statement**: You have an array of integers A1, A2, ..., AN. The function F(P), where P is a subset of A, is defined as the XOR (represented by the symbol ⊕) of all the integers present in the subset. If P is empty, then F(P) = 0.

Given an integer K, what is the maximum value of K ⊕ F(P) over all possible subsets P of A?]

The only change I made is to initialize answer = K. Then, after building the basis with Gaussian elimination, I greedily checked for each basis vector whether taking it increases the answer.
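In code, the change amounts to something like this (a sketch continuing the one above; `maxXorWithK` is just a name I use here, and `basis` / `BITS` are from the previous sketch):

```cpp
// Same basis as above; the only change for "XOR with Subset":
// start the greedy from K instead of 0.
long long maxXorWithK(long long k) {
    long long ans = k;
    for (int b = BITS; b >= 0; b--)
        if ((ans ^ basis[b]) > ans)   // take the vector only if it helps
            ans ^= basis[b];
    return ans;
}
```

As far as I understand, this greedy is valid only when the basis is in row-echelon form and the pivots are scanned from the most significant bit downward; in that order, the decision at each pivot bit is final, since no later basis vector can affect it.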

My code: Link

Can anyone please tell me what's going wrong?
