hari6988's blog

By hari6988, 13 years ago, In English

Hi,

Can someone tell me on what basis you decide that a naive solution will get a Time Limit Exceeded error? I have faced this problem quite a few times.


13 years ago, # |

You should know that an average checking system can perform about 10,000,000 (10^7) simple operations per second. (For example, my PC can add 30,000,000 integers together in a second.) So if you do something like this:

for (int i = 0; i < 100000; i++)
    for (int j = 0; j < 100000; j++)
        ... // do smth

it will definitely get TLE (10^10 operations). But if you want to sort an array of 100000 elements (100000 · log2(100000) ≈ 1.7·10^6 < 10^7), everything will be OK.
13 years ago, # |
The main method for this is formal complexity estimation, the one with O() notation. There are also smaller implementation factors affecting performance, like cache efficiency, but the time limits are usually not very strict, and an asymptotically good solution is usually enough.
13 years ago, # |
You can also measure the time your program takes to run on the worst case(s); usually, try the largest inputs the constraints allow.