Optimal Algorithm on Polynomial Composite Set Power Series


This is a part of my report on China Team Selection 2021. I might translate some other parts later if you are interested.

I'm going to show that we can calculate $$$f(G)$$$ for any polynomial $$$f\in R[x]$$$ and any set power series $$$G: 2^{n} \rightarrow R$$$ using $$$\Theta(n^2 2^n)$$$ operations on $$$R$$$.

Let's first consider a special case: calculating $$$F = \exp G$$$. We regard a set power series as a truncated multivariate polynomial, i.e. an element of $$$R[x_1,\dots,x_n]/(x_1^2,\dots,x_n^2)$$$. We can see that $$$F' = FG'$$$ for whichever partial derivative $$$\frac{\partial}{\partial x_k}$$$ we take. Taking $$$k = n$$$ and comparing coefficients, this equation becomes $$$[x_n^1] F = ([x_n^1]G)\cdot [x_n^0]F$$$. Hence we have reduced the problem to calculating $$$[x_n^0] F$$$, and then one subset convolution gives the rest. The time complexity is $$$T(n)=\Theta(n^22^n) + T(n-1)$$$, which is exactly $$$\Theta(n^22^n)$$$. I call this method "Point-wise Newton Iteration".
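
For concreteness, here is a minimal sketch of this iteration in C++. The names `subset_convolution` and `sps_exp` are mine, and the ring $$$R$$$ is taken to be `long long` purely for illustration; in practice you would plug in a modular-arithmetic type.

```cpp
#include <bits/stdc++.h>
using namespace std;

// Subset convolution of a and b over k variables (arrays of length 1 << k),
// via ranked zeta/Moebius transforms, in O(k^2 2^k) ring operations.
vector<long long> subset_convolution(const vector<long long>& a,
                                     const vector<long long>& b, int k) {
    int N = 1 << k;
    // fa[r][S] = a[S] if popcount(S) == r, else 0; same for fb.
    vector<vector<long long>> fa(k + 1, vector<long long>(N, 0)), fb = fa;
    for (int S = 0; S < N; ++S) {
        fa[__builtin_popcount(S)][S] = a[S];
        fb[__builtin_popcount(S)][S] = b[S];
    }
    // Zeta transform (sum over subsets) of every rank slice.
    for (int r = 0; r <= k; ++r)
        for (int i = 0; i < k; ++i)
            for (int S = 0; S < N; ++S)
                if (S >> i & 1) {
                    fa[r][S] += fa[r][S ^ (1 << i)];
                    fb[r][S] += fb[r][S ^ (1 << i)];
                }
    // Point-wise multiplication of the rank vectors.
    vector<vector<long long>> fc(k + 1, vector<long long>(N, 0));
    for (int r = 0; r <= k; ++r)
        for (int s = 0; s <= r; ++s)
            for (int S = 0; S < N; ++S)
                fc[r][S] += fa[s][S] * fb[r - s][S];
    // Moebius transform back and keep the entry of the correct rank.
    for (int r = 0; r <= k; ++r)
        for (int i = 0; i < k; ++i)
            for (int S = 0; S < N; ++S)
                if (S >> i & 1) fc[r][S] -= fc[r][S ^ (1 << i)];
    vector<long long> c(N);
    for (int S = 0; S < N; ++S) c[S] = fc[__builtin_popcount(S)][S];
    return c;
}

// exp of a set power series G (requires G[0] == 0) by point-wise Newton iteration:
// writing G = G0 + x_{i+1} G1, we have [x_{i+1}^1] exp(G) = exp(G0) * G1.
vector<long long> sps_exp(const vector<long long>& G, int n) {
    assert(G[0] == 0);
    vector<long long> F(1 << n, 0);
    F[0] = 1;  // exp of the empty series
    for (int i = 0; i < n; ++i) {
        // F on [0, 2^i) is already exp of G restricted to the first i variables.
        vector<long long> F0(F.begin(), F.begin() + (1 << i));
        vector<long long> G1(G.begin() + (1 << i), G.begin() + (2 << i));
        vector<long long> F1 = subset_convolution(F0, G1, i);
        copy(F1.begin(), F1.end(), F.begin() + (1 << i));
    }
    return F;  // total cost: sum over i of O(i^2 2^i) = O(n^2 2^n)
}
```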

This method sounds more reasonable than calculating $$$\exp$$$ on each rank vector, because $$$\frac{G^k}{k!}$$$ counts all "partitions" into $$$k$$$ nonempty sets drawn from $$$G$$$; since at most $$$n$$$ nonempty sets can appear, $$$\exp G$$$ exists whenever $$$1, \dots, n$$$ are invertible in $$$R$$$.

Now we try to apply this kind of iteration to an arbitrary polynomial $$$f$$$. Let $$$F=f(G)$$$; then $$$F'=f'(G)G'$$$. This reduces the problem to calculating $$$[x_n^0]f(G)$$$ and $$$[x_n^0]f'(G)$$$. It becomes two problems! But don't worry: in the second round of recursion, it becomes $$$[x_{n-1}^0 x_n^0]$$$ of $$$f(G), f'(G), f''(G)$$$. The number of problems grows only linearly. You will find that the complexity is $$$\sum_{0\le k\le n} (n-k) \cdot \Theta(k^2 2^k)$$$.
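
Here is a sketch of this recursion, organized bottom-up so that `d[j]` always holds $$$f^{(j)}(G)$$$ restricted to the current prefix of variables. It reuses `subset_convolution` from the sketch above, again with `long long` standing in for $$$R$$$; the names `poly_eval` and `sps_composite` are mine.

```cpp
// Evaluate a univariate polynomial f (coefficients, low to high) at a point.
long long poly_eval(const vector<long long>& f, long long x) {
    long long v = 0;
    for (int i = (int)f.size() - 1; i >= 0; --i) v = v * x + f[i];
    return v;
}

// Compute f(G) for a set power series G over n variables in Theta(n^2 2^n) ring operations.
vector<long long> sps_composite(vector<long long> f, const vector<long long>& G, int n) {
    // Base of the recursion: on zero variables, d[j] = { f^{(j)}(G[empty set]) }.
    vector<vector<long long>> d(n + 1);
    for (int j = 0; j <= n; ++j) {
        d[j] = {poly_eval(f, G[0])};
        // Replace f by its derivative for the next j.
        for (size_t i = 1; i < f.size(); ++i) f[i - 1] = f[i] * (long long)i;
        if (!f.empty()) f.pop_back();
    }
    // Add the variables one at a time. Writing G = G0 + x_{i+1} G1 gives
    //   f^{(j)}(G) = f^{(j)}(G0) + x_{i+1} * ( f^{(j+1)}(G0) * G1 ),
    // where * is a subset convolution on the first i variables.
    for (int i = 0; i < n; ++i) {
        vector<long long> G1(G.begin() + (1 << i), G.begin() + (2 << i));
        for (int j = 0; j < n - i; ++j) {   // only d[0 .. n-i-1] are needed later
            vector<long long> hi = subset_convolution(d[j + 1], G1, i);
            d[j].insert(d[j].end(), hi.begin(), hi.end());
        }
    }
    return d[0];   // f(G) on all n variables
}
```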

It's not hard to see that the complexity is still $$$\Theta(n^2 2^n)$$$, because $$$\sum_{k\ge 0} k\cdot 2^{-k}$$$ converges.
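
Explicitly, substituting $$$j = n-k$$$ gives $$$\sum_{0\le k\le n} (n-k)\, k^2 2^k = 2^n \sum_{0\le j\le n} j\,(n-j)^2\, 2^{-j} \le n^2 2^n \sum_{j\ge 0} j\, 2^{-j} = O(n^2 2^n)$$$, and the $$$k = n-1$$$ term alone is already $$$\Theta(n^2 2^n)$$$.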

Noting that $$$\frac{(A+B)^2}2-\frac{A^2}2-\frac{B^2}2=AB$$$, subset convolution also reduces to polynomial composition, so the two problems are equivalent up to constant factors.

Here are some remarks:

  • If $$$G$$$ has no constant term, it is better to interpret $$$f$$$ as an EGF, i.e. to compute $$$\sum_k f_k \frac{G^k}{k!}$$$ instead of $$$\sum_k f_k G^k$$$.
  • This algorithm achieves the same complexity for calculating $$$f(A, B)$$$ with a bivariate polynomial $$$f$$$, and so on.
  • We can reduce the constant factor of this algorithm by keeping everything as rank vectors throughout the iteration. Then it can beat many methods that only work for specific choices of $$$f$$$.
  • If you have some knowledge of the "Transposition Principle", you will find that, for any fixed $$$F, G: 2^n \rightarrow R$$$, the transposed algorithm gives $$$\sum_S F_S \cdot (G^k)_S$$$ for all $$$k$$$ simultaneously. This immediately gives a $$$\Theta(n^22^n)$$$ algorithm to calculate the whole chromatic polynomial of a graph with $$$n$$$ vertices (see the note below).
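
To spell out the last point: let $$$I$$$ be the set power series with $$$I_S = 1$$$ for every nonempty independent set $$$S$$$ and $$$I_S = 0$$$ otherwise. Then $$$\frac{(I^k)_V}{k!}$$$ counts the partitions of the vertex set $$$V$$$ into $$$k$$$ nonempty independent sets, so the chromatic polynomial is $$$P(x) = \sum_{k} \frac{(I^k)_V}{k!}\, x(x-1)\cdots(x-k+1)$$$, and taking $$$F$$$ to be the indicator of $$$V$$$ in the transposed algorithm yields all the needed values $$$(I^k)_V$$$ at once.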
