Need help on this problem!

Given two arrays a and b, both of size n, find the maximum m for which there exist indices 1 <= i1 < i2 < ... < im <= n such that max(a[i1], a[i2], ..., a[im]) + sum(b[i1], b[i2], ..., b[im]) <= S, where S is given.

Constraints: n <= 10^5, 1 <= a[i], b[i] <= 10^9

S <= 10^18

a[i] < a[i+1] for every 1 <= i <= n-1 (array a is strictly increasing)

Input: the first line contains n and S.

The following n lines each contain a[i] and b[i].

An example test:

Input:

4 10

1 5

2 1

3 3

4 2

Output:

3

Explanation: the chosen indices are (2, 3, 4) (1-indexed): max(a[2], a[3], a[4]) + (b[2] + b[3] + b[4]) = 4 + (1 + 3 + 2) = 10 <= S.
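In case a direction helps: here is one O(n log n) idea (my own sketch, not a verified accepted solution; all function names in it are mine). Since a is strictly increasing, the maximum of the chosen a-values is always a[j], where j is the last chosen index. So try every j as the forced last pick: the remaining budget is S - a[j] - b[j], and the best extras are the smallest b-values among positions 1..j-1 that still fit. A Fenwick tree indexed by the rank of each b-value, storing a count and a sum per node, answers "how many smallest stored values fit into this budget" with a binary descent in O(log n).

```cpp
#include <bits/stdc++.h>
using namespace std;

int n, LOG;
long long cntTree[100005], sumTree[100005];

// Insert one b-value at its rank in the sorted order of all b's.
void addValue(int pos, long long val) {
    for (; pos <= n; pos += pos & -pos) {
        cntTree[pos] += 1;
        sumTree[pos] += val;
    }
}

// Among the values inserted so far, take them smallest-first:
// how many fit into `budget`? Binary descent on the Fenwick tree.
long long countWithinBudget(long long budget) {
    int pos = 0;
    long long taken = 0;
    for (int step = LOG; step > 0; step >>= 1) {
        int next = pos + step;
        if (next <= n && sumTree[next] <= budget) {
            budget -= sumTree[next];
            taken += cntTree[next];
            pos = next;
        }
    }
    return taken;
}

int main() {
    long long S;
    scanf("%d %lld", &n, &S);
    vector<long long> a(n), b(n);
    for (int i = 0; i < n; i++) scanf("%lld %lld", &a[i], &b[i]);

    // Give every element a unique rank by (value, index), so a
    // Fenwick prefix over ranks is exactly "the smallest values first".
    vector<int> ord(n), rankOf(n);
    iota(ord.begin(), ord.end(), 0);
    sort(ord.begin(), ord.end(), [&](int x, int y) {
        return b[x] != b[y] ? b[x] < b[y] : x < y;
    });
    for (int r = 0; r < n; r++) rankOf[ord[r]] = r + 1;

    LOG = 1;
    while ((LOG << 1) <= n) LOG <<= 1;

    long long best = 0;
    for (int j = 0; j < n; j++) {
        // j is the last chosen index, so a[j] is the max; j must be taken.
        long long budget = S - a[j] - b[j];
        if (budget >= 0)
            best = max(best, 1 + countWithinBudget(budget));
        addValue(rankOf[j], b[j]);  // j becomes available for later j's
    }
    printf("%lld\n", best);
    return 0;
}
```

On the sample this picks j = 4 (1-indexed) with budget 10 - 4 - 2 = 4; the two smallest earlier b-values (1 and 3) fit, giving m = 3.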
