By adsijf, history, 5 weeks ago,

Given two arrays a and b, both of size n, find m, the length of the longest subsequence of indices 1 <= i1 < i2 < ... < im <= n such that max(a[i1], a[i2], ..., a[im]) + sum(b[i1], b[i2], ..., b[im]) <= S, where S is given.

Constraints: n <= 10^5, 1 <= a[i], b[i] <= 10^9

S <= 10^18

a[i] < a[i+1] for every 1 <= i <= n-1 (array a is increasing)

Input : First line : n,S

The following n lines : a[i] b[i]

An example test:

Input:

4 10

1 5

2 1

3 3

4 2

Output :

3

Explanation: the chosen subsequence is (2, 3, 4) (1-indexed): max(a) = 4 and sum(b) = 1 + 3 + 2 = 6, so 4 + 6 = 10 <= S.
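As a sanity check on the sample, a tiny brute force over all index subsets (feasible only because n = 4 here; the array literals are copied from the sample input) confirms the answer:

```python
# Brute-force check of the sample test.
from itertools import combinations

a = [1, 2, 3, 4]
b = [5, 1, 3, 2]
S = 10

best = 0
n = len(a)
for m in range(1, n + 1):
    for idx in combinations(range(n), m):   # every index subset of size m
        if max(a[i] for i in idx) + sum(b[i] for i in idx) <= S:
            best = max(best, m)

print(best)  # 3
```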

5 weeks ago, # |

## TC: O(n)

Let the answer be the longest contiguous window starting at index l and ending at index r.

Since array a is given to be increasing, max(a[l], a[l+1], ..., a[r]) = a[r]:
the max part is always contributed by the rightmost index of the window.

Now the problem becomes: for each r, find the smallest l such that

# sum(b[l], b[l+1], ..., b[r]) <= ( s - a[r] )

## name the right-hand side ( s - a[r] ) req_sum

For the sample, with r = 4: req_sum = 10 - 4 = 6, and sum(b[2..4]) = 1 + 3 + 2 = 6 <= 6.

The input format hints that the problem can be solved online, processing each pair (a[i], b[i]) as it is read.

Imagine this problem as a snake moving in a straight line.

At every index its head grows (sum += b).

If its length grows beyond the allowed budget (req_sum), we trim its tail (sum -= b[l]) until it fits again.

At every index we record the length of the snake ;)
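The snake can be traced on the sample data; this is a sketch with the arrays copied from the sample, and the `l < r` guard is an added assumption that stops the trim once the window is empty (needed only if a single element already exceeds the budget):

```python
# Trace the "snake" on the sample (a = [1,2,3,4], b = [5,1,3,2], S = 10).
a = [1, 2, 3, 4]
b = [5, 1, 3, 2]
S = 10

l = 0               # tail position (0-indexed)
window_sum = 0      # current length of the snake, in b-weight
lengths = []        # recorded window length at every index
for r, (ar, br) in enumerate(zip(a, b), start=1):
    window_sum += br            # the head grows
    req_sum = S - ar            # allowed budget once ar is the max
    while window_sum > req_sum and l < r:
        window_sum -= b[l]      # trim the tail
        l += 1
    lengths.append(r - l)

print(lengths)  # [1, 2, 2, 3]
```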

## Pseudocode

n , s = input

B = [ ]                 # store array b (array a is not stored; only the current a is needed)
sum_of_subsequence = 0
l = 0 , r = 0           # l, r are the markers of the starting and ending index of the window
answer = 0              # best window length seen so far

for ( i = 1 ; i <= n ; i++ ):
    a , b = input
    req_sum = s - a
    sum_of_subsequence += b
    B.append(b)
    r += 1
    while ( l < r and sum_of_subsequence > req_sum ):   # if the sum exceeds the budget, shrink from the left
        sum_of_subsequence -= B[l]                      # l < r guards against an over-empty window when a > s
        l += 1
    length_of_subsequence = r - l
    if ( length_of_subsequence > answer ):
        answer = length_of_subsequence

print(answer)
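Putting the pseudocode together as runnable Python (a sketch: the `solve` helper and line-based input are my framing, and `answer` starts at 0 since an empty window is always valid):

```python
def solve(lines):
    it = iter(lines)
    n, s = map(int, next(it).split())
    B = []              # only b values are stored; each a is used on arrival
    window_sum = 0      # sum of b over the current window (l, r]
    l = 0
    answer = 0
    for r in range(1, n + 1):
        a_r, b_r = map(int, next(it).split())
        B.append(b_r)
        window_sum += b_r
        req_sum = s - a_r              # budget for the b-sum once a_r is the window max
        while window_sum > req_sum and l < r:
            window_sum -= B[l]         # trim the tail
            l += 1
        answer = max(answer, r - l)
    return answer

sample = ["4 10", "1 5", "2 1", "3 3", "4 2"]
print(solve(sample))  # 3
```

On a judge, replace `sample` with `sys.stdin.read().splitlines()`.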