This problem was asked in the online coding round of a company.

There are n servers. The i-th server has requests[i] requests to serve but can serve a maximum of max_req[i] requests.

In order to balance the load, we need to redirect some requests of the i-th server to other servers. The latency induced in redirecting a request from the i-th server to the j-th server is |i-j|, where |x| denotes the absolute value of x. The max_latency is defined as the maximum latency induced among all the redirections.

Given the arrays requests and max_req, find the minimum possible max_latency if the requests are redirected optimally, such that no server has to serve more than the maximum number of requests it can serve. If there is no way to serve all the requests, report -1 as the answer. For example, with requests = [3, 0] and max_req = [2, 2], one request must be redirected from server 1 to server 2, so the answer is 1.

Interesting question. What is the expected time complexity? I am not able to come up with anything in O(n) or O(n log n). Waiting for others in the community to post a solution.

Note that the answer is $$$-1$$$ if and only if $$$\sum requests[i]>\sum max\text{_}req[i]$$$; we assume that's not the case.
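A minimal sketch of this check (the function name `solvable` is mine):

```cpp
#include <bits/stdc++.h>
using namespace std;

// The answer is -1 iff total demand exceeds total capacity:
// redirection moves requests around but cannot create capacity.
bool solvable(const vector<long long>& requests, const vector<long long>& max_req) {
    long long demand   = accumulate(requests.begin(), requests.end(), 0LL);
    long long capacity = accumulate(max_req.begin(), max_req.end(), 0LL);
    return demand <= capacity;
}
```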

Observation: there exists an optimal configuration such that for all $$$a<b$$$, if a request from $$$a$$$ is redirected to $$$c$$$ and a request from $$$b$$$ is redirected to $$$d$$$, then $$$c\le d$$$.

Proof: otherwise, redirect from $$$a$$$ to $$$d$$$ and from $$$b$$$ to $$$c$$$ instead. Since $$$a<b$$$ and $$$d<c$$$, both new distances $$$|a-d|$$$ and $$$|b-c|$$$ are at most $$$\max(|a-c|,|b-d|)$$$, so the exchange does not increase the answer; repeating it yields a non-crossing configuration.

Now, we can binary search on the answer. Each binary search iteration can be checked with an $$$O(n)$$$ greedy algorithm: iterating across servers from left to right, push requests as far to the left as possible. The total complexity is $$$O(n\log n)$$$.
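A sketch of the whole approach in C++ (function names `feasible` and `min_max_latency` are mine). The greedy relies on the non-crossing observation: it matches each request to the leftmost server that still has spare capacity and is within the allowed distance.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Can all requests be served if a redirected request may travel
// at most `lat` positions? Two pointers: requests are consumed in
// order of their server i, capacities in order of their server j.
bool feasible(const vector<long long>& requests, const vector<long long>& max_req, long long lat) {
    int n = requests.size();
    int j = 0;                          // current capacity server
    long long cap = max_req[0];         // unused capacity on server j
    for (int i = 0; i < n; i++) {
        long long need = requests[i];
        while (need > 0) {
            // capacity left of i - lat can never serve request i (or any later one)
            while (j < n && (cap == 0 || j < i - lat)) {
                j++;
                cap = (j < n ? max_req[j] : 0);
            }
            if (j >= n || j > i + lat) return false;  // request i cannot be served
            long long take = min(need, cap);
            need -= take;
            cap  -= take;
        }
    }
    return true;
}

// Binary search the minimum feasible max latency; -1 if even lat = n-1 fails
// (which happens exactly when total demand exceeds total capacity).
long long min_max_latency(const vector<long long>& requests, const vector<long long>& max_req) {
    int n = requests.size();
    long long lo = 0, hi = n - 1;
    if (!feasible(requests, max_req, hi)) return -1;
    while (lo < hi) {
        long long mid = (lo + hi) / 2;
        if (feasible(requests, max_req, mid)) hi = mid;
        else lo = mid + 1;
    }
    return lo;
}
```

Each feasibility check is $$$O(n)$$$ since both pointers only move forward, giving $$$O(n\log n)$$$ overall.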

Does anyone have an implementation of this problem?