I refer you to Problem C (Lumberjack) in EC 2012 (Link: http://ch24.org/static/archive/2012/2012_ec.pdf).
I have a question regarding this problem. The problem statement says that the order in which trees are cut may matter. However, I am unable to find an example in which this is the case. Could someone help me understand why this is so, or give me a small example that I can use to understand the situation?
Also, I have an alternative solution that gives a wrong answer on large testcases, but I am unable to figure out why it fails. The solution is as follows.
Let L[x] denote the leftmost tree that will fall if tree x is pushed to the left.
Let R[x] denote the rightmost tree that will fall if tree x is pushed to the right.
Let F[x] denote the minimum number of cuts needed to cut down EXACTLY the trees 1...x (that is, cutting down trees 1...x must not cause any tree outside that range to fall).
First, we initialise F[0] to 0 and F[x] to infinity for all x >= 1. Then we run a for loop with i from 1 to N and apply the following updates:
F[i] = min(F[i], 1 + F[L[i]-1]); //If tree i is pushed to the left
F[R[i]] = min(F[R[i]], 1 + F[i-1]); //If tree i is pushed to the right
I think an implicit assumption in this DP solution is that the ranges of the trees I push down do not overlap, so the order in which the trees are pushed should not matter.
Thanks in advance for your help!