For every i in the range [l, r], if c_i is c_1 then change it into c_2...
Because n and m are both very small, O(nm) can easily pass it.
PS. You can also use a binary search tree to solve it faster.
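The O(nm) brute force is just two nested loops; a minimal sketch (the function name and the 1-indexed convention are mine):

```cpp
#include <string>

// One update of the O(nm) brute force: in the 1-indexed range [l, r],
// replace every occurrence of c1 by c2. Applying all m updates costs O(nm).
void applyUpdate(std::string& s, int l, int r, char c1, char c2) {
    for (int i = l; i <= r; ++i)
        if (s[i - 1] == c1) s[i - 1] = c2;
}
```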
The k-th smallest zcy number is conn(str(k), rev(str(k))), where str denotes the decimal representation of a positive integer as a string, conn denotes the concatenation of two strings, and rev denotes the reverse of a string.
Then go over the smallest k such numbers and sum them up to obtain the answer.
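A sketch of this construction (the function name is mine, and it assumes k is small enough to enumerate directly, as in the constraints):

```cpp
#include <string>

// Sum of the k smallest zcy numbers modulo p.
// The i-th zcy number is conn(str(i), rev(str(i))), e.g. i = 12 -> 1221.
long long zcySum(long long k, long long p) {
    long long sum = 0;
    for (long long i = 1; i <= k; ++i) {
        std::string s = std::to_string(i);
        std::string t(s.rbegin(), s.rend());   // rev(str(i))
        sum = (sum + std::stoll(s + t)) % p;   // conn(str(i), rev(str(i)))
    }
    return sum;
}
```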
f(n) = str1 + f(n - 1) + str2 + f(n - 1) + str3.
First we can compute the length of f(n) for all possible n.
For a pair of (n, k), we can easily determine which part the k-th character is in.
If it's in f(n - 1), we can solve the problem recursively.
The complexity of this algorithm is O(n), which is sufficient to pass all tests.
Obviously, length(f(n)) ≥ 2 · length(f(n - 1)), so length(f(60)) ≥ 2^60, which already exceeds the maximum possible k.
It means that for all n > 60, the k-th character of f(n) can only be in str1 or in the first f(n - 1).
Then we can answer a query in O(60) time.
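The whole query can be sketched as follows; the concrete strings str0...str3 here are placeholders of my own, not the ones from the statement:

```cpp
#include <string>
#include <algorithm>

// Sketch of the character query on f(n) = str1 + f(n-1) + str2 + f(n-1) + str3.
// The concrete strings below are placeholders, not the ones from the problem.
const std::string str0 = "abc";                 // f(0)
const std::string str1 = "x[", str2 = "][", str3 = "]";
const long long INF = 2e18;                     // lengths saturate here
const int CAP = 60;                             // length(f(60)) already exceeds any valid k

long long len[CAP + 1];
bool initLens() {
    len[0] = str0.size();
    long long extra = str1.size() + str2.size() + str3.size();
    for (int i = 1; i <= CAP; ++i)
        len[i] = std::min(INF, 2 * len[i - 1] + extra);
    return true;
}
const bool lensReady = initLens();

// k-th (1-indexed) character of f(n), or '.' if f(n) is shorter than k.
char query(long long n, long long k) {
    if (n > CAP) {
        // For n > CAP, the prefix of f(n) is (n - CAP) copies of str1, then f(CAP).
        long long pre = (n - CAP) * (long long)str1.size();
        if (k <= pre) return str1[(k - 1) % str1.size()];
        k -= pre;
        n = CAP;
    }
    while (n > 0) {
        if (k > len[n]) return '.';
        if (k <= (long long)str1.size()) return str1[k - 1];
        k -= str1.size();
        if (k <= len[n - 1]) { --n; continue; }   // inside the first f(n-1)
        k -= len[n - 1];
        if (k <= (long long)str2.size()) return str2[k - 1];
        k -= str2.size();
        if (k <= len[n - 1]) { --n; continue; }   // inside the second f(n-1)
        k -= len[n - 1];
        return str3[k - 1];
    }
    return k <= len[0] ? str0[k - 1] : '.';
}
```

With these placeholders, f(1) = "x[abc][abc]", so for example query(1, 3) descends once and lands on 'a'.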
As the initial sheets are already in non-decreasing order (vacuously, since they contain no numbers), all we have to do is maintain this order.
We use a simple method to do so: find the first sheet whose number is strictly greater than the given number (or the first empty sheet) and replace it with the new number.
In each round, we either replace an existing number with a strictly smaller one, or fill in an empty sheet. The first case happens at most c - 1 times per sheet, and the second case exactly once per sheet, so each sheet is modified at most c times. Hence the total number of rounds won't be more than n × c.
To pass all the tests, we only need to maintain 2 similar sequences: one non-decreasing from the front, receiving the numbers not larger than ⌈c/2⌉, and one non-increasing from the back, receiving the rest. This bounds the total number of rounds by n × ⌈c/2⌉, precisely, and we can use binary search or brute force to complete the "finding" process.
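The "find and replace" rule of the single-sequence strategy can be sketched offline (names are mine; 0 marks an empty sheet). Each placement keeps the filled prefix non-decreasing:

```cpp
#include <vector>

// One round of the simple strategy: write x on the first sheet whose number
// is strictly greater than x, or on the first empty sheet (0 marks empty).
// Returns the 1-based index of the sheet written to.
int place(std::vector<int>& sheet, int x) {
    for (int i = 0; i < (int)sheet.size(); ++i)
        if (sheet[i] == 0 || sheet[i] > x) { sheet[i] = x; return i + 1; }
    return -1;  // unreachable while the game is still winnable
}
```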
This is an interesting algorithm which can easily deal with many data structure problems, if the data is random...
I initially named it the "Old Driver Tree" (which is my Codeforces ID).
(But now I call it Chtholly Tree~).
Notice that one of the operations makes a whole range of numbers the same.
We can use an interval tree (std::set is enough) to maintain every interval that consists of the same number.
And for operation 2, we destroy all the intervals inside [l, r] and insert a new interval [l, r] into the interval tree.
For operations 1, 3 and 4, we can walk over the tree by brute force, visit every interval inside [l, r], and apply the required operation to each of them.
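A minimal sketch of this structure, using std::map keyed by each interval's left endpoint (equivalent to the std::set formulation in the text; all names are mine, and it assumes some interval always covers the queried position):

```cpp
#include <map>
#include <iterator>

// Chtholly-tree sketch: key = left endpoint of an interval, value = the
// common value of every position in that interval.
using ODT = std::map<long long, long long>;

// Ensure an interval starts exactly at pos; return the iterator to it.
ODT::iterator split(ODT& odt, long long pos) {
    auto it = std::prev(odt.upper_bound(pos));  // interval containing pos
    if (it->first == pos) return it;
    return odt.insert(it, {pos, it->second});   // cut it into two pieces
}

// Operation 2: destroy all intervals inside [l, r], insert the interval [l, r].
void assign(ODT& odt, long long l, long long r, long long v) {
    auto itr = split(odt, r + 1), itl = split(odt, l);
    odt.erase(itl, itr);
    odt[l] = v;
}

// The brute-force walk used by operations 1, 3, 4; here: add x on [l, r].
void add(ODT& odt, long long l, long long r, long long x) {
    auto itr = split(odt, r + 1), itl = split(odt, l);
    for (auto it = itl; it != itr; ++it) it->second += x;
}

long long valueAt(ODT& odt, long long pos) {
    return std::prev(odt.upper_bound(pos))->second;
}
```

Note that assign splits at r + 1 before l, so the returned left iterator is still valid when the range is erased.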
Proof of time complexity:
Suppose we have a randomly selected range [l, r], and we choose uniformly at random which operation to perform; suppose there are x intervals intersecting this range.
With probability 1/4, we use O(x) time to erase O(x) nodes.
With probability 2/4, we use O(x) time but erase nothing.
With probability 1/4, we use O(x) time, erase nothing, and add 2 new nodes to the tree.
So in expectation, we use O(x) time to erase O(x) nodes.
By using the interval tree (std::set) to maintain the intervals, the expected time complexity of this problem is O((n + m) log n).
If operations 3 and 4 are changed into "output the sum of a_i for every i in range [l, r]", it seems that the time complexity may become larger, but I do not know how to prove it...
First let's consider a simpler problem in which there are no customers with VIP cards and no 50-yuan notes are left at the end. For convenience, we suppose that n is an even number; the case of odd n is similar.
By viewing the states (number of customers served so far, number of 50-yuan notes in hand) as points on a 2D plane, the answer to our second question is the number of routes from (0, 0) to (n, 0) in which adjacent points' y-coordinates differ by exactly 1 and no point goes below the x-axis.
There are C(n, n/2) routes in total, but some of them are invalid. Consider instead routes starting from (0, -2).
For each invalid route, consider the first point (x, y) with y < 0 (necessarily y = -1).
By reflecting the part of the route before this point across the line y = -1, each invalid route becomes exactly one route starting from (0, -2), and conversely every route starting from (0, -2) becomes an invalid route in the same way.
So the number of invalid routes is C(n, n/2 - 1) (that is, the number of routes from (0, -2) to (n, 0)). Thus the answer is C(n, n/2) - C(n, n/2 - 1).
Similarly, if the number of 50-yuan notes left at the end must lie in [l, r] (with l and r of the same parity as n), the sum telescopes and the answer is C(n, n/2 - l/2) - C(n, n/2 - r/2 - 1).
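For small n, the formula C(n, n/2 - l/2) - C(n, n/2 - r/2 - 1) can be checked against a direct enumeration (helper names are mine; n, l, r must share the same parity):

```cpp
// Check the reflection formula on small inputs: count ±1 routes of length n
// starting from height 0 that never go below 0 and end at a height in [l, r].
long long C(int n, int k) {
    if (k < 0 || k > n) return 0;
    long long r = 1;
    for (int i = 1; i <= k; ++i) r = r * (n - k + i) / i;  // exact at every step
    return r;
}

long long byFormula(int n, int l, int r) {
    // C(n, n/2 - l/2) - C(n, n/2 - r/2 - 1)
    return C(n, (n - l) / 2) - C(n, (n - r) / 2 - 1);
}

long long byBruteForce(int n, int l, int r) {
    long long cnt = 0;
    for (int mask = 0; mask < (1 << n); ++mask) {  // bit set = 50-yuan note (+1)
        int h = 0;
        bool ok = true;
        for (int i = 0; i < n && ok; ++i) {
            h += (mask >> i & 1) ? 1 : -1;
            if (h < 0) ok = false;
        }
        if (ok && l <= h && h <= r) ++cnt;
    }
    return cnt;
}
```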
Now let's enumerate how many customers have VIP cards. If there are i of them, the answer gains a factor of C(n, i) (the number of ways to choose which customers they are).
One last question is about the modulus. First factor it into the form p_1^{a_1} · p_2^{a_2} · ..., where p_1, ... are primes. For each prime power we can calculate how many factors p_i there are in j!, and the value of the remaining product modulo p_i^{a_i}.
Each time we take the factors p_i out of j!, it splits into a product of numbers that are not divisible by p_i, a remaining smaller factorial (⌊j / p_i⌋)!, and a power of p_i. For example, suppose we want to calculate the number of factors 3 in 16!, and the product of the numbers in 16! that are not divisible by 3, mod 3^2. Then we have:
16! = (1 · 2 · 4 · 5 · 7 · 8 · 10 · 11 · 13 · 14 · 16) × (1 · 2 · 3 · 4 · 5) × 3^5
The numbers in the first part are not divisible by 3, so we can precompute their product (mod 3^2); the second part is a smaller instance of the same problem (5!), so we can solve it recursively. For the number of factors 3, just add 5 in this case and recurse.
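The recursion can be sketched as follows (the globals p, pa and the naive block product are simplifications of mine; a real solution precomputes the block product and uses fast exponentiation):

```cpp
// Sketch: return j! with all factors p removed, modulo pa = p^a, and add
// the number of removed factors p to *cnt.
long long p, pa;  // prime p and the modulus pa = p^a

long long factRem(long long j, long long* cnt) {
    if (j == 0) return 1;
    long long blockProd = 1;                       // product of non-multiples of p in [1, pa]
    for (long long i = 1; i <= pa; ++i)
        if (i % p != 0) blockProd = blockProd * i % pa;
    long long res = 1;
    for (long long b = 0; b < j / pa; ++b)         // full blocks of length pa
        res = res * blockProd % pa;
    for (long long i = j / pa * pa + 1; i <= j; ++i)
        if (i % p != 0) res = res * (i % pa) % pa; // the partial last block
    *cnt += j / p;                                 // one factor p per multiple of p...
    return res * factRem(j / p, cnt) % pa;         // ...and the subproblem (j/p)!
}
```

On the 16! example above: the count comes out to 5 + 1 = 6 factors of 3, and the residue matches 16!/3^6 mod 9.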
After calculating how many factors p_i there are in j! and the value of the remaining product modulo p_i^{a_i}, we can compute the binomial coefficients correctly under each prime power. Finally, use the Chinese Remainder Theorem to combine them.
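The combining step is standard CRT; a two-modulus sketch (function names are mine, moduli assumed coprime):

```cpp
// Extended gcd: returns g = gcd(a, b) and x, y with a*x + b*y = g.
long long extgcd(long long a, long long b, long long& x, long long& y) {
    if (b == 0) { x = 1; y = 0; return a; }
    long long x1, y1, g = extgcd(b, a % b, x1, y1);
    x = y1; y = x1 - (a / b) * y1;
    return g;
}

// The unique z in [0, m1*m2) with z ≡ r1 (mod m1) and z ≡ r2 (mod m2),
// assuming gcd(m1, m2) = 1 and 0 <= r1 < m1.
long long crt(long long r1, long long m1, long long r2, long long m2) {
    long long x, y;
    extgcd(m1, m2, x, y);                               // m1*x ≡ 1 (mod m2)
    long long t = ((__int128)x * (r2 - r1) % m2 + m2) % m2;
    return r1 + m1 * t;
}
```

For more than two prime powers, fold the residues in pairwise.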
I'm really sorry for letting the brute force algorithm pass the tests...
My code runs in about 600 ms; in order to let some algorithms with a large constant or a larger time complexity pass, I set the time limit to 3000 ms.
The most naive brute force runs in about 8000 ms to 9000 ms; I added some tricks, and the fastest one runs in 5600 ms.
In all the contests I've attended, pragma GCC was not allowed to be used...
But on Codeforces, this can optimize the brute-force algorithm from 5600 ms to about 2500 ms...
My solution to this problem:
Split the array into blocks, each containing O(√n) numbers.
In each block, for example block x, use f[x][v] to represent the number of v in block x.
For each number i, belong[i] is the block that i is in.
We need to maintain the current value of each number in its block.
This can be maintained by using DSU or linked list.
By maintaining this, we can get the values of all numbers in a block in O(√n) time.
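One way to sketch the per-block DSU bookkeeping (an illustration with names of my own; the full solution additionally relabels the value slots when a whole block is shifted via tag):

```cpp
#include <vector>

// Per-block sketch: each distinct value owns a DSU node; every element points
// to its value's node, so "turn all a's into b's" is a single union.
struct Block {
    std::vector<int> fa;        // DSU parents over nodes (node 0 unused)
    std::vector<int> val;       // value represented by a root node
    std::vector<int> rt, cnt;   // rt[v]: node of value v (0 = none); cnt[v]
    std::vector<int> node;      // node[i]: the DSU node of element i

    Block(int maxv, const std::vector<int>& a)
        : fa(1, 0), val(1, 0), rt(maxv + 1, 0), cnt(maxv + 1, 0), node(a.size()) {
        for (int i = 0; i < (int)a.size(); ++i) {
            if (!rt[a[i]]) {
                rt[a[i]] = (int)fa.size();
                fa.push_back((int)fa.size());   // new self-rooted node
                val.push_back(a[i]);
            }
            node[i] = rt[a[i]];
            ++cnt[a[i]];
        }
    }
    int find(int x) { return fa[x] == x ? x : fa[x] = find(fa[x]); }
    int valueOf(int i) { return val[find(node[i])]; }

    void merge(int a, int b) {  // every element of value a becomes value b
        if (a == b || !rt[a]) return;
        if (!rt[b]) {           // b unused: relabel a's node instead of uniting
            rt[b] = rt[a];
            val[find(rt[b])] = b;
        } else {
            fa[find(rt[a])] = find(rt[b]);
        }
        cnt[b] += cnt[a];
        cnt[a] = 0;
        rt[a] = 0;
    }
};
```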
Notice that these two operations are the same:
1. For every number that is bigger than x, decrease it by x
2. Decrease every number by x, and for every number that is less than 1, increase it by x
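The equivalence is easy to check on any array whose values are at least 1 (helper names are mine):

```cpp
#include <vector>

// The two formulations give the same result on any array of values >= 1.
std::vector<int> op1(std::vector<int> a, int x) {
    for (int& v : a)
        if (v > x) v -= x;          // decrease only the numbers bigger than x
    return a;
}

std::vector<int> op2(std::vector<int> a, int x) {
    for (int& v : a) v -= x;        // decrease everything...
    for (int& v : a)
        if (v < 1) v += x;          // ...and push the underflowed ones back
    return a;
}
```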
For operation 1:
We get the value of each number in blocks belong[l] and belong[r] using the DSU or linked list; then we update every number that should change.
Then we build block belong[l] and belong[r] up again.
For blocks numbered from belong[l] + 1 to belong[r] - 1:
If x × 2 ≤ max value in block p, we merge all the numbers in range [1, x] into [x + 1, x × 2] and add x to tag[p], where tag[p] means that all the numbers in block p have been decreased by tag[p].
If x × 2 > max value in block p, we merge all the numbers in range [x + 1, maxvalue] into [1, maxvalue - x].
For operation 2:
We get the value of each number in block belong[l] and belong[r] using the DSU or linked list.
We only need to traverse all the numbers in blocks belong[l] and belong[r], and traverse all the blocks between belong[l] and belong[r].
For each block i completely inside [l, r], f[i][x + tag[i]] is the number of x's in block i, so we just need to add it to the answer.
Proof of time complexity:
There are O(√n) blocks.
The difference between the max number and the min number in each block is initially at most n, so the sum of this quantity over all blocks is O(n√n).
For each operation 1, we use O(x) or O(max - x) time to make the difference between the max and min elements O(x) or O(max - x) smaller.
For each operation 2, we traverse O(√n) numbers in the two boundary blocks and O(√n) whole blocks.
So the total time complexity is O((n + m)√n).
There seems to be another algorithm with the same time complexity, and has a smaller constant, but I couldn't prove its complexity so I used this algorithm instead.
(The missing picture on Div. 1E)
If you don't like it, just skip it please.