How do you find the kth largest element when k varies between queries and the array is modifiable? That is, you can add new elements to the array, and the array can contain duplicates.

For example, say the array is 10, 20, 15.

2nd largest = 15

add 17 to array

the array becomes 10, 15, 17, 20

3rd largest = 15

The number of queries Q <= 10^5.

You can use PBDS (GCC policy-based data structures).

The array can contain duplicates too. I tried it with PBDS, but it only works for unique elements.

Use `less_equal` instead of `less`.

Thanks a lot. It worked.

Suppose we also want to delete a single element that occurs more than once.

A.erase(x) will delete all occurrences of x. How do I erase just a single x?


You can keep pairs $$$(a_i, id)$$$ in the ordered set ($$$less$$$), where $$$id$$$ is unique for each pair.

Ordered_multiset

Just change `less` to `less_equal`

Like this:

But it gives a wrong answer after I perform a delete, and I am not able to delete a single element from it.

You can use `find_by_order` to find the kth element, and adding is just `insert`.

Yes, but after deletions it doesn't work correctly. See the link above.

You need to write

Thanks.

proggerkz, but there is one problem here.

Suppose the array is 1, 3, 3, 5, 7, 9, 11.

3rd smallest = 3

elements less than 9 = 5

delete(3)

it becomes 1, 3, 5, 7, 9, 11

3rd smallest should now be 5, but it shows 3. http://ideone.com/Oessur

The number of elements less than 9 should now be 4, but it still reports 5.

After deletion, things are not working as they should.

In the blog above you only talked about adding and finding the kth element. An ordered_set with `less_equal` can't erase an element by value x, but it can erase the kth element by doing A.erase(A.find_by_order(k)).

I just wanted adding and finding the kth element for a problem.

The rest I was just experimenting with on my own. Sorry for the trouble.

Thanks for help.

Try implementing it with a Fenwick tree. If all values are less than 1e5, it becomes relatively easy. If values can be larger, store the queries and process the elements offline so you can compress the larger values to smaller ones. Every element will be compressed to some value <= 1e5, since the number of unique values is <= 1e5. Now you can use binary search on the Fenwick tree!

I think that square root decomposition can handle this. Just compress the numbers and make two tables with respect to rank: one of size O(N), the other of size O(sqrt(N)). I leave filling in the rest of the details as a small exercise for you.

This solution also works if you have deletions from the array or point updates, but it cannot handle range updates.