How Does setprecision Work on Division?

Revision en2, by fonmagnus, 2022-09-26 03:12:24

Last night I worked on this problem from Codeforces Round 823: 1730B - Meeting on the Line

1. Wrong answer on test 3 — 173469198
2. Accepted — 173524856

The accepted one uses the setprecision function, while the wrong-answer one does not.

When I looked into the cause, I stumbled upon the following message from the checker: wrong answer 36th numbers differ - expected: '40759558.0000000', found: '40759600.0000000', error = '0.0000010'

How come the difference becomes so large? I know floating point has its own "weaknesses" when handling precision and such, but how can using "setprecision" versus not using it produce such a different outcome?

I'd appreciate any answers because I'm curious. Thanks!
