Can someone please tell me why my submission 105471957 for the Longest Palindrome problem (1304B — 22) is giving a runtime error on test #3? Thanks.

One thing I have noticed is that removing the **#define int long long int** line from the code gives an Accepted verdict. This can be seen in submission 105471434.

The problematic line is

`for(int i=y.size()-1;i>=0;i--){`

If `y` is empty, then `y.size()` returns 0, but that zero is unsigned, so `0 - 1` wraps around to a huge all-ones value. As you're using GNU C++17, your compiler is 32-bit, so `0 - 1` is basically 32 bits of ones, i.e. $$$2^{32} - 1$$$.

Now, if you use `#define int long long int`, this 32-bit unsigned integer gets zero-extended to 64 bits, so $$$i = 2^{32} - 1$$$, which is of course larger than zero; the loop goes on and you access the array at index $$$2^{32} - 1$$$. If you don't use `#define int long long int`, this 32-bit unsigned integer is converted to a 32-bit signed integer, so $$$2^{32} - 1$$$ becomes $$$-1$$$, which is less than zero, and the loop stops.

Solution: replace `y.size() - 1` with `(int)y.size() - 1`.

Thank you so much for the explanation. Now it is clear.