Can't find maximum X1 + X2 using Gaussian Elimination in D. Two Sets.

In this problem I am trying to find the maximum X1 + X2. Let X be the XOR of all the given numbers. I treated each number as a vector over GF(2) and built a basis with Gaussian elimination (a sketch of this step is shown right after the example). Let's see an example:

5 = 1 0 1

6 = 1 1 0

7 = 1 1 1

After Gaussian elimination the basis is:

1 0 0

0 1 0

0 0 1
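For reference, this is roughly how I build the basis; the names `insertVector`, `basisVec` and `BITS` are only for this sketch, not taken from my actual code. The back-substitution step keeps the basis in the fully reduced form shown above (100 / 010 / 001).

```cpp
#include <bits/stdc++.h>
using namespace std;

const int BITS = 30;          // assumed bit width, adjust to the constraints
long long basisVec[BITS];     // basisVec[b] has leading bit b, or is 0

// Gaussian elimination over GF(2): insert one number into the XOR basis.
void insertVector(long long v) {
    for (int b = BITS - 1; b >= 0; --b) {
        if (!((v >> b) & 1)) continue;                    // b is not the leading bit of v
        if (basisVec[b]) { v ^= basisVec[b]; continue; }  // eliminate the leading bit
        // v is a new independent vector with leading bit b.
        // Back-substitute so the basis stays fully reduced,
        // like the 100 / 010 / 001 form in the example.
        for (int j = b - 1; j >= 0; --j)
            if (((v >> j) & 1) && basisVec[j]) v ^= basisVec[j];
        for (int j = b + 1; j < BITS; ++j)
            if ((basisVec[j] >> b) & 1) basisVec[j] ^= v;
        basisVec[b] = v;
        return;
    }
    // v reduced to 0: it was already in the span of the basis.
}
```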

Then I iterated from the most significant bit, and if the i-th bit of X is 0 I tried to put a 1 in that bit of X1; if that is not possible I skipped the bit and moved on to the next one. The idea is that X2 = X1 XOR X, so a bit where X is 1 adds 2^b to the sum no matter how it is split, while a bit where X is 0 appears in both X1 and X2 or in neither, so setting it in X1 adds 2^(b+1). But this process does not give me the maximum X1 + X2. Here is the implementation. Can anyone tell me what is wrong here? Is my logic wrong, or is the implementation not correct?
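This is the greedy I described, written out as code so the question is concrete. It is only a sketch (it reuses `basisVec` and `BITS` from the snippet above), not my submitted code, and since this is exactly the logic I am unsure about, the mistake may well be in here.

```cpp
// Sketch of the greedy described above: skip bits where X is 1, and for every
// zero bit of X (from the most significant down) try to set it in X1 using the
// basis. X2 is then X1 ^ X. Uses basisVec[] and BITS from the previous sketch.
pair<long long, long long> splitMax(long long X) {
    long long x1 = 0;
    for (int b = BITS - 1; b >= 0; --b) {
        if ((X >> b) & 1) continue;               // this bit contributes 2^b anyway
        if (!((x1 >> b) & 1) && basisVec[b])      // bit still 0 and the basis can set it
            x1 ^= basisVec[b];
    }
    return {x1, x1 ^ X};
}
```

On the example above, X = 5 ^ 6 ^ 7 = 4, so bits 1 and 0 of X are zero; the sketch sets both of them in X1 and returns X1 = 3, X2 = 7.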

Please help me. I am stuck here.
