Bengal_Tiger's blog

By Bengal_Tiger, 5 years ago, In English

I have solved the popular XOR maximization problem XMAX — XOR Maximization using Gaussian elimination. [Problem statement: Given a set of integers S = { a1, a2, a3, ..., a|S| }, we define a function X on S as follows: X(S) = a1 ^ a2 ^ a3 ^ ... ^ a|S|. Given a set of N integers, compute the maximum of the X-function over all subsets of the given starting set.]
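For reference, here is a minimal sketch of what my accepted approach looks like (the standard XOR linear basis; the 60-bit bound, I/O format, and names are my own assumptions, not copied from my actual submission):

```cpp
#include <bits/stdc++.h>
using namespace std;

const int BITS = 60; // assumption: all values fit in 60 bits

int main() {
    int n;
    cin >> n;
    // basis[i] holds a vector whose highest set bit is i (0 if none yet)
    vector<long long> basis(BITS, 0);
    for (int k = 0; k < n; ++k) {
        long long x;
        cin >> x;
        // Gaussian elimination over GF(2): reduce x against the basis
        for (int i = BITS - 1; i >= 0; --i) {
            if (!((x >> i) & 1)) continue;          // bit i of x is not set
            if (!basis[i]) { basis[i] = x; break; } // x is a new basis vector
            x ^= basis[i];                          // eliminate bit i from x
        }
    }
    long long ans = 0;
    // Greedy: from the highest bit down, take a basis vector if it improves ans
    for (int i = BITS - 1; i >= 0; --i)
        if ((ans ^ basis[i]) > ans) ans ^= basis[i];
    cout << ans << '\n';
    return 0;
}
```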

But when I apply the same code to XOR with Subset from CodeChef, it gets WA.

[**Problem statement**: You have an array of integers A1, A2, ..., AN. The function F(P), where P is a subset of A, is defined as the XOR (represented by the symbol ⊕) of all the integers present in the subset. If P is empty, then F(P) = 0.

Given an integer K, what is the maximum value of K ⊕ F(P) over all possible subsets P of A?]

The only change I made was to initialize answer = K. Then, after running Gaussian elimination, I greedily checked for each basis element whether taking it increases the answer.
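Roughly, the modified greedy step looks like this (a sketch; `basis` is the array built exactly as in the snippet above):

```cpp
// Sketch: the greedy step seeded with K instead of 0.
// 'basis' is built the same way as in the previous snippet.
long long maxXorWithK(const vector<long long>& basis, long long k) {
    long long ans = k; // start from K rather than 0
    for (int i = (int)basis.size() - 1; i >= 0; --i)
        if ((ans ^ basis[i]) > ans) // parentheses matter: ^ binds looser than >
            ans ^= basis[i];
    return ans;
}
```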

My code: Link

Can anyone please tell me what's going wrong?



»
5 years ago, # |

Hey, I read your code for about 15 minutes and it seemed perfectly OK to me. Then I noticed a tiny little silly mistake:

on the 34th line you have a parentheses problem: instead of x ^ ans > ans you have to change it to (x ^ ans) > ans.

Fixing that will give you AC.
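To spell out why that matters: in C++, the relational operator > binds tighter than bitwise ^, so x ^ ans > ans parses as x ^ (ans > ans), i.e. x ^ 0, which is true whenever x is nonzero. A tiny repro (my own example values, not from your code):

```cpp
#include <cstdio>

int main() {
    long long ans = 12, x = 10;   // 12 ^ 10 = 6, which is NOT greater than 12
    if (x ^ ans > ans)            // parses as x ^ (ans > ans) == x ^ 0 == x
        printf("taken (wrong: condition is really just x != 0)\n");
    if ((x ^ ans) > ans)          // the intended comparison
        printf("taken\n");
    else
        printf("not taken (correct)\n");
    return 0;
}
```

By the way, g++ with -Wall (-Wparentheses) typically warns about exactly this pattern, so it's worth compiling with warnings on.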

PS: you could code the Gaussian elimination without building the actual matrix (simply by applying bitwise operators on the array). Good luck!
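For example, something along these lines (a sketch; the 60-bit bound and names are illustrative, not from your submission):

```cpp
#include <bits/stdc++.h>
using namespace std;

int main() {
    int n;
    cin >> n;
    vector<long long> a(n);
    for (auto &v : a) cin >> v;

    // Row-reduce the array itself over GF(2): a[0..r-1] become the basis
    // rows, each with a distinct highest set bit, in decreasing order.
    int r = 0; // rows fixed so far
    for (int bit = 59; bit >= 0 && r < n; --bit) {
        int pivot = -1;
        for (int i = r; i < n; ++i)
            if ((a[i] >> bit) & 1) { pivot = i; break; }
        if (pivot == -1) continue;   // no remaining row has this bit
        swap(a[r], a[pivot]);
        for (int i = 0; i < n; ++i)  // clear this bit from every other row
            if (i != r && ((a[i] >> bit) & 1)) a[i] ^= a[r];
        ++r;
    }

    long long ans = 0; // or ans = K for the CodeChef version
    for (int i = 0; i < r; ++i)
        if ((ans ^ a[i]) > ans) ans ^= a[i]; // greedy over the basis rows
    cout << ans << '\n';
    return 0;
}
```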