Question -> https://codeforces.com/problemset/problem/264/A
My Code -> https://pastebin.com/DpxZ8wEp
Precision problem: the `long double` data type supports only limited precision.
When the values are small enough, there is no guarantee that the inequality $$$L<(L+R)/2<R$$$ holds; the computer may instead give you something like $$$L=(L+R)/2\leq R$$$. For instance, if all characters in your input are 'l', your positions are $$$1/2,1/4,\ldots,1/2^n$$$. Can the computer distinguish $$$1/2^{999990}$$$ from $$$1/2^{999991}$$$ properly? (In fact it treats $$$1/2^{16446}$$$ as $$$0$$$.)