karanmishra8997's blog

By karanmishra8997, history, 6 years ago, In English

I have an array A of N elements (1 <= N <= 200000) and Q queries (1 <= Q <= 200000). Each query gives three integers x, l and r, and asks for min{x*j + A[j]} over l <= j <= r. I was thinking of solving this with a segment tree or dynamic programming, but I can't get an acceptable time complexity.

Example: A = [12, 13, 33, 25, 10, 41, 51, 2]
Query: 2 1 5, Output: 14
Query: 5 5 8, Output: 35

Can anybody tell me the right approach?
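
A minimal brute-force sketch, just to pin down the statement and confirm the 1-indexed sample queries (the name bruteQuery is mine, not from the post); it is of course far too slow for N, Q up to 200000:

```cpp
#include <bits/stdc++.h>
using namespace std;

// Naive reference for min{x*j + A[j]}, l <= j <= r (1-indexed),
// only to illustrate the statement; O(N) per query is too slow here.
long long bruteQuery(const vector<long long>& A, long long x, int l, int r) {
    long long best = LLONG_MAX;
    for (int j = l; j <= r; ++j)
        best = min(best, x * j + A[j - 1]);   // A is stored 0-indexed
    return best;
}

int main() {
    vector<long long> A = {12, 13, 33, 25, 10, 41, 51, 2};
    cout << bruteQuery(A, 2, 1, 5) << "\n";   // 14
    cout << bruteQuery(A, 5, 5, 8) << "\n";   // 35
}
```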



By karanmishra8997, history, 6 years ago, In English

Suppose I have n nodes and Q queries. Each query is one of the following types:

1) 1 x y: add an edge between nodes x and y.
2) 2 x y: find the distance between x and y (the number of edges on the path from x to y).
3) 3 x: find the number of nodes in the component containing node x.

It is guaranteed that the structure always stays a forest, i.e. no query creates a cycle. Constraints: 1 <= N, Q <= 100000.

I know that we can join x and y with the Disjoint Set Union technique, and that DSU can also store the number of nodes in each component (as in the sketch below). But I don't know how to answer query type 2. Can anybody help?
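
A minimal sketch of the DSU-with-sizes idea mentioned above, covering query types 1 and 3 (the struct and method names are my own, not from the post); query type 2, the distance query, is the part still missing:

```cpp
#include <bits/stdc++.h>
using namespace std;

// DSU with path compression and union by size.
struct DSU {
    vector<int> parent, sz;
    DSU(int n) : parent(n + 1), sz(n + 1, 1) {
        iota(parent.begin(), parent.end(), 0);     // every node is its own root
    }
    int find(int v) { return parent[v] == v ? v : parent[v] = find(parent[v]); }
    void unite(int a, int b) {                     // query type 1: add edge a-b
        a = find(a), b = find(b);
        if (a == b) return;
        if (sz[a] < sz[b]) swap(a, b);
        parent[b] = a;
        sz[a] += sz[b];
    }
    int componentSize(int v) { return sz[find(v)]; }   // query type 3
};

int main() {
    DSU dsu(5);
    dsu.unite(1, 2);                          // query "1 1 2"
    dsu.unite(2, 3);                          // query "1 2 3"
    cout << dsu.componentSize(3) << "\n";     // query "3 3" -> 3
}
```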

