A C++ question

Revision en2, by andr1y, 2021-09-05 19:36:54

In today's CF Round #742, I submitted my code for problem E and suddenly got a "Compilation Error". The compile log showed this:

Compiled file is too large [47861470 bytes], but maximal allowed size is 33554432 bytes [CompileRequest {id='program.cpp', description='', file='program.cpp', resources='', type='cpp.msys2-mingw64-9-g++17'}].

Submission: 127958200

I thought this was a system bug and resubmitted the code, but got the same error: 127958576. After that, I saw that my binary file really was too large, about 44 MB. I tried different things like removing unused includes and enabling optimizations such as -Ofast/-O3, but none of this helped. Then I decided to replace all long longs except t_obj.ans with ints, but I only got as far as separating ans from the other variables and making a tiny change, ll ans=0; -> ll ans;, and compiled. Strangely, the binary size dropped to 36 KB. To be safe, I removed =0 from almost all the variables and added a constructor that sets them to zero, then submitted that. It compiled fine and passed the pretests: 127960252.
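For reference, the change looked roughly like this (a simplified sketch: the member names x and y, the array name objs and its size are made up for illustration, only t_obj and ans are from my actual code):

    typedef long long ll;

    struct t_obj {
        ll ans;                          // was: ll ans = 0;  (the "=0" I removed)
        int x, y;                        // hypothetical other members, previously also "= 0"
        t_obj() : ans(0), x(0), y(0) {}  // zeroing moved into a constructor instead
    };

    t_obj objs[1000000];                 // hypothetical large global array of t_obj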

So, the question is: how can =0 next to the variables affect the binary file size? Does this happen only with GNU-based compilers? Can this behaviour be predicted, and can it be forced? Can I reduce running time this way at the cost of longer compile time?

Sorry for my bad English, and thanks for any answers.

Tags c++
