
Suppose you are given two arrays of length N (1 <= N <= 100000), say A[0 .. N-1] and B[0 .. N-1]. Now you have to output another array, C[0 .. N-1], such that:

C[i] = A[i] + min(C[j] - B[j] * B[i]) over all 0 <= j < i.

Note: C[0] = A[0]

For example, suppose A = [3, 9, 4] and B = [2, 4, 2].

C[0] = A[0] = 3

C[1] = 9 + (C[0] - B[0] * B[1]) = 9 + (3 - 8) = 4

C[2] = 4 + min(C[0] - B[0] * B[2], C[1] - B[1] * B[2]) = 4 + min(-1, -4) = 0

Ranges:

1 <= N <= 100000

-10^9 <= A[i] <= 10^9

0 < B[i] < 10^7

Besides the brute-force O(N^2) solution, I have thought about a segment tree, but it will not work here: the term being minimized, C[j] - B[j] * B[i], depends on the current index i, so it cannot be answered as a plain range-minimum query.
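
For reference, here is a minimal sketch of the brute-force computation I mean, assuming 0-based arrays and using long long to avoid overflow (B[j] * B[i] can reach about 10^14). The variable names and the hard-coded example are just for illustration, taken from the example above.

#include <bits/stdc++.h>
using namespace std;

int main() {
    // Example from the post: A = [3, 9, 4], B = [2, 4, 2]
    vector<long long> A = {3, 9, 4};
    vector<long long> B = {2, 4, 2};
    int n = (int)A.size();

    vector<long long> C(n);
    C[0] = A[0];                          // base case
    for (int i = 1; i < n; ++i) {
        long long best = LLONG_MAX;
        for (int j = 0; j < i; ++j)       // minimize over all j < i
            best = min(best, C[j] - B[j] * B[i]);
        C[i] = A[i] + best;
    }

    for (int i = 0; i < n; ++i)
        cout << C[i] << " \n"[i == n - 1]; // prints: 3 4 0
    return 0;
}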

I think some data structure I do not know yet is needed here. Could somebody suggest the right data structure, or explain how to approach the problem with it?

Thanks in advance.
