### aarcee's blog

By aarcee, history, 2 years ago

#### Brief Explanation:

• You are given n segments with their left and right endpoints on the OX axis.
• You are given q queries, where each query starts with an integer m followed by m integers (points).
• The answer to a query is the number of segments such that each of them contains at least one point from the query.

#### My Approach:

• Sort the points given in the query.
• Start from the leftmost point.
• For the first point, find the number of segments that contain it and add that to the sum. (Counting the segments that contain a given point can be done with binary search.)
• For the second point, do the same as for the first point, but subtract the number of segments that contain the whole range [first, second], as those were already counted for the first point.
• Similarly, for the third point, find the number of segments that contain it and subtract the number of segments that contain the range [second, third], as they were already counted for the first two points.
• In general, for the i-th point, find the number of segments that contain it and subtract the number of segments that contain the range [point i−1, point i].
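The binary-search step for a single point can be sketched like this (a minimal sketch; `containingPoint` and the variable names are mine, not from the post). The idea: a segment [l, r] misses the point x exactly when l > x or when r < x, and those two failure cases are disjoint, so the answer is n minus both counts.

```cpp
#include <algorithm>
#include <vector>
using namespace std;

// Count how many of the n segments contain the point x.
// ls = all left endpoints sorted, rs = all right endpoints sorted
// (each array is sorted independently of the other).
//   contains(x) = n - (#segments with l > x) - (#segments with r < x)
int containingPoint(const vector<int>& ls, const vector<int>& rs, int x) {
    int n = ls.size();
    // segments starting strictly after x
    int leftAfter = ls.end() - upper_bound(ls.begin(), ls.end(), x);
    // segments ending strictly before x
    int rightBefore = lower_bound(rs.begin(), rs.end(), x) - rs.begin();
    return n - leftAfter - rightBefore;
}
```

For example, with segments [1,4], [2,6], [5,7] we pass `ls = {1,2,5}`, `rs = {4,6,7}`; the point 3 is contained in [1,4] and [2,6], so `containingPoint(ls, rs, 3)` returns 2. Sorting the two endpoint arrays once up front makes each point query O(log n).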

But I am not able to figure out a way to count, among the given n segments, those that contain a specific range [L, R] in O(log n) time. Is there a way to do it?
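Not quite O(log n), but one standard way to get O(log² n) per range query: a segment contains [L, R] iff l ≤ L and r ≥ R. If you sort the segments by l, the candidates with l ≤ L form a prefix of that order, and a merge sort tree over the r values counts how many r's in that prefix are ≥ R. A sketch under those assumptions (the struct and its names are my own, assuming n ≥ 2):

```cpp
#include <algorithm>
#include <utility>
#include <vector>
using namespace std;

// Counts segments with l <= L and r >= R, i.e. segments containing [L, R].
struct MergeSortTree {
    int n;
    vector<int> ls;            // left endpoints, sorted
    vector<vector<int>> tree;  // node -> sorted r's of the leaves under it

    MergeSortTree(vector<pair<int,int>> segs) {
        sort(segs.begin(), segs.end());  // order by left endpoint
        n = segs.size();
        ls.resize(n);
        for (int i = 0; i < n; i++) ls[i] = segs[i].first;
        tree.assign(2 * n, {});
        for (int i = 0; i < n; i++) tree[n + i] = {segs[i].second};
        for (int v = n - 1; v >= 1; v--)  // merge children bottom-up
            merge(tree[2 * v].begin(), tree[2 * v].end(),
                  tree[2 * v + 1].begin(), tree[2 * v + 1].end(),
                  back_inserter(tree[v]));
    }

    // how many values in v are >= R (v is sorted)
    static int atLeast(const vector<int>& v, int R) {
        return v.end() - lower_bound(v.begin(), v.end(), R);
    }

    // number of segments containing [L, R]
    int countContaining(int L, int R) {
        // candidates with l <= L form the prefix [0, len)
        int len = upper_bound(ls.begin(), ls.end(), L) - ls.begin();
        int res = 0;
        // standard bottom-up segment-tree walk over that prefix
        for (int lo = n, hi = n + len; lo < hi; lo >>= 1, hi >>= 1) {
            if (lo & 1) { res += atLeast(tree[lo], R); lo++; }
            if (hi & 1) { hi--; res += atLeast(tree[hi], R); }
        }
        return res;
    }
};
```

For example, with segments (1,4), (2,6), (5,7), (3,10), the query [3,5] is contained in (2,6) and (3,10) only, so `countContaining(3, 5)` returns 2. If the queries can be processed offline, sorting them together with the segments and using a Fenwick tree brings this down to O(log n) per query.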

Please tell me if some part of my approach seems wrong, or if you have a different solution.