hackerblack's blog

By hackerblack, 5 weeks ago, In English

Can someone please tell me why my submission 105471957 to problem 1304B (Longest Palindrome) is giving a runtime error on test #3? Thanks.

One thing I have noticed is that removing the #define int long long int line from the code gives an Accepted verdict, as can be seen in submission 105471434.

»
5 weeks ago, # |

Auto comment: topic has been updated by hackerblack (previous revision, new revision, compare).

»
5 weeks ago, # |
Rev. 2

The problematic line is for(int i=y.size()-1;i>=0;i--){. If y is empty, then y.size() returns 0, but that zero has the unsigned type size_t, so 0 - 1 wraps around to a huge all-ones value. As you're using 'GNU C++17', which compiles for a 32-bit target, size_t is 32 bits wide, so 0 - 1 is 32 bits of ones, i.e. $$$2^{32} - 1$$$.

Now, if you use #define int long long int, this 32-bit unsigned value is zero-extended to 64 bits, so you have $$$i = 2^{32} - 1$$$, which is of course larger than zero. The loop keeps going and you access the array at index $$$2^{32} - 1$$$, which is far out of bounds, hence the runtime error.

If you don't use #define int long long int, this 32-bit unsigned value is converted to a 32-bit signed integer, so $$$2^{32} - 1$$$ becomes $$$-1$$$, which is less than zero, and the loop stops immediately.

Solution: replace y.size() - 1 with (int)y.size() - 1.