TrungNT's blog

By TrungNT, history, 3 years ago, In English

Hello everyone!

Today, I encountered a problem which can be reduced to this: given an $$$N \times M$$$ $$$(1 \leq N, M \leq 2000)$$$ matrix, answer $$$Q$$$ $$$(Q \leq 5000)$$$ queries, each asking for the number of distinct colors among the cells of the submatrix with top-left cell $$$(x_1, y_1)$$$ and bottom-right cell $$$(x_2, y_2)$$$ (the "static colored orthogonal range counting" problem).

There is no bound on the number of colors, unlike in a well-known CodeChef problem.

The 1D problem, SPOJ DQUERY, can be easily solved using a persistent segment tree, with $$$O(n \log n)$$$ space and $$$O(\log n)$$$ query time.

I have found some papers giving solutions with $$$O(n^2 \log^2 n)$$$ preprocessing time and $$$O(\log^2 n)$$$ time per query; however, the data structures there seem too advanced for me. They look like an upgrade of the data structure for the 1D problem, but they are something different from (and much more complex than) a persistent segment tree.

Are there any simpler algorithms that run in $$$O(n)$$$ or less per query (which would be enough for this problem)? Offline solutions would be fine too, but I haven't come up with any (I've tried to apply Mo's algorithm, but I don't see how).

Thanks for helping!


»
3 years ago, # |

If the range of elements is quite small, say up to 1000 or 2000, then you can keep a bitset at every $$$(r, c)$$$, and the answer to a query is the number of set bits in the bitwise OR over the submatrix, which can be calculated using a 2D segment tree, if I'm not wrong.

»
3 years ago, # |

Assume that $$$n = q$$$. Just like doing Mo's algorithm on intervals (points in 2 dimensions) is $$$O(n^{1.5})$$$, you can do it on rectangles (points in 4 dimensions) to get $$$O(n^{1.75} \cdot X)$$$, assuming you can move one coordinate of a rectangle in $$$O(X)$$$. Such a move requires updating the state of up to $$$N$$$ cells, each in $$$O(1)$$$ by maintaining a frequency array, so the total complexity is $$$O(n^{2.75})$$$. Not the best complexity, but well, it's sub-cubic.

To find a path through the points for Mo's algorithm, first think about how you would do it in two and three dimensions. You need to split the rectangles into groups according to the tuple of all four coordinates, each divided by $$$n^{0.75}$$$ and floored.

I hope I didn't say something stupid. I never really did 4-dimensional Mo.

»
3 years ago, # |

Let's assume $$$M = N$$$, and let's do Mo's algorithm on columns with some data structure. Every time we move a pointer, we add/remove all elements of one column to/from the data structure. Doing so gives a complexity of $$$O(NM \sqrt N \cdot X + Q \cdot Y)$$$, where $$$X$$$ is the time to update the data structure for a single element and $$$Y$$$ is the query time.

Here is one approach with $$$X = O(\log^2 n)$$$ and $$$Y = O(\log^2 n)$$$. Let's see how adding/removing an element changes the result. Suppose we are adding element $$$val$$$ at row $$$x$$$, and the data structure already contains that element at rows $$$r_1, r_2, \ldots, r_t$$$; call this set $$$R$$$. If row $$$x$$$ is already in $$$R$$$, we simply skip updating the result (and perhaps bump a counter for this row so that later we know when to remove it). Otherwise, let $$$r_i$$$ and $$$r_{i+1}$$$ be the previous and next elements of $$$x$$$ in $$$R$$$. Before the addition, every rectangle whose lower row is in $$$(r_i, x]$$$ and whose upper row is in $$$[x, r_{i+1})$$$ did not contain this element; afterwards it does, and only rectangles with such row ranges change their answer, so we should increase their answer by 1. This update is exactly adding 1 on a submatrix in (lower row, upper row) space, with corners $$$(r_i + 1, x)$$$ and $$$(x, r_{i+1} - 1)$$$. Hence we can use a 2D Fenwick tree for the updates, and the rest are just point queries.

But that seems unbalanced between updating and querying. How about $$$X = O(1)$$$ and $$$Y = O(N \sqrt N)$$$? :)

Use the same idea as above, just with a different data structure. In the Fenwick-tree solution, each update changes the values at 4 points. Let's store those values in a plain $$$N \times M$$$ array $$$X$$$. For faster querying, also keep a 2D array $$$Y$$$ of size $$$\sqrt N \times \sqrt M$$$, where each entry stores the sum of the elements of $$$X$$$ inside one block of size $$$\sqrt N \times \sqrt M$$$. Updating is obviously $$$O(1)$$$, because we only update one element of $$$X$$$ and one of $$$Y$$$. For querying, we sum up all blocks lying fully inside the query rectangle using $$$Y$$$; there are no more than $$$\sqrt N \cdot \sqrt M$$$ of them. The remaining elements at the border are added by hand, and it can be shown that there are no more than $$$O((\sqrt N \cdot \sqrt M) \cdot \sqrt N)$$$ of them.

I also hope that I didn't say anything stupid.

Edit: I forgot that we also need to find the next and previous elements quickly. If we just use a set, then $$$X$$$ is not $$$O(1)$$$ but $$$O(\log n)$$$. I'm still thinking about how to optimize this.