*Hello friends, this is my first ever blog on Codeforces. A few days back, in my Design of Algorithms lecture, the Fast Fourier Transform was taught, and I had some difficulty grasping it. I have since learnt it from various sources like CLRS and Emaxx and thought I would make it easy for others as well. :)* *Here I present to you a very fundamental approach to understanding what the Fast Fourier Transform does.*

Let us try to multiply two polynomials of degree n. What is the time complexity if we multiply them the way we did in high school? We would have to multiply every coefficient of one polynomial with every coefficient of the other. This naive approach, as we can see, takes O(n^2) time. Now we ask ourselves the question that underlies the design and analysis of algorithms: **can we do better?**

As it turns out, we can. Using the Fast Fourier Transform, we can bring the complexity down to O(n*log(n)). But that is a while away, as we shall first look at the Discrete Fourier Transform, the basis for FFT. *Well, FFT is just an optimization of DFT in which we store previously calculated values and use them to compute the current values. Going too far, am I not? I must restrict the discussion of FFT here and concentrate on DFT for the time being.*

**What is DFT?**

- Before answering this, just think: can we retrieve a polynomial y = P(x) of degree n if we know the value of y for (n+1) distinct values of x? Yes, certainly we can, because we get n+1 linear equations in n+1 unknown coefficients, so the coefficients are uniquely determined.
- So, suppose I need C(x) = A(x)*B(x). Since C has degree 2*n, I need the values of A(x) and B(x) at 2*n+1 points to calculate all the coefficients of C, which is the result. Now, how much time does this take? We must evaluate each of A and B at 2*n+1 points, and each evaluation takes O(n) time if we compute it optimally by applying Horner's rule, which goes like this: *Suppose I have to calculate the value of 10+6*x+6*(x^2)+3*(x^3). I rewrite it as 10+x*(6+x*(6+x*3)): compute 3*x and add 6 to it, which makes it (3*x+6); multiplying that by x, we get 3*x^2+6*x; add 6, multiply by x once more, and finally add 10. I think you guys have guessed where this is going? See how small changes make life so much easier!!! :)*
- As I have already mentioned, there are 2*n+1 evaluations, with each evaluation taking O(n) computations in itself. This takes our complexity to O(n^2), which is as bad as our naive algorithm, but what DFT has done here is provide us with an idea of how to get to that O(n*log(n)) solution. How, you ask? Keep reading, and we will see. :)
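Horner's rule from the example above can be written in a few lines (names are my own, coefficients stored from the constant term upward):

```cpp
#include <vector>

// Horner's rule: evaluate a polynomial at x using one multiplication and
// one addition per coefficient, i.e. O(n) total, instead of recomputing
// powers of x from scratch for every term.
long long evalHorner(const std::vector<long long>& coef, long long x) {
    long long result = 0;
    // Walk from the highest-degree coefficient down to the constant term:
    // result = (...((c_n * x + c_{n-1}) * x + c_{n-2}) * x ... ) + c_0
    for (int i = static_cast<int>(coef.size()) - 1; i >= 0; --i)
        result = result * x + coef[i];
    return result;
}
```

With the article's example 10 + 6x + 6x^2 + 3x^3 at x = 2 this yields 10 + 12 + 24 + 24 = 70.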

**Now let us move to the Fast Fourier Transform.** Here I assume everyone has the prerequisite knowledge of complex numbers and how they are represented as vectors on the plane. If not, please revise it here. Here, I am going to state one very important property of the nth roots of unity, which is: click here. In case you had any problem interpreting that messy handwriting of mine, it says that

# (w(n, k+n/2))^2 = (w(n, k))^2, where w(n, k) denotes the nth root of unity raised to the kth power.

So, what does this buy us? We calculate the transform values for the first n/2 indices, and the next n/2 follow from the above property without any extra evaluation work. How do we go about business here?
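A quick numeric sanity check of this property (the helper name is my own): squaring the k-th and the (k + n/2)-th of the nth roots of unity gives the same value, because the two roots differ only in sign.

```cpp
#include <cmath>
#include <complex>

// w(n, k): the nth root of unity raised to the kth power,
// i.e. e^(2*pi*i*k/n), built from its polar form.
std::complex<double> rootOfUnity(int n, int k) {
    const double PI = std::acos(-1.0);
    return std::polar(1.0, 2.0 * PI * k / n);
}
```

For n = 8 and k = 1, both w(8, 1)^2 and w(8, 5)^2 equal i (up to floating-point error), illustrating (w(n, k+n/2))^2 = (w(n, k))^2.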

We can divide the polynomial into two parts, one containing the even powers (let us denote it by a0(x^2)) and the other containing the odd powers (let us denote it by x*a1(x^2)). Now, we use divide and conquer to compute the transforms of a0 and a1 and then combine them as A(x) = a0(x^2) + x*a1(x^2). This gives us the fast Fourier transform of the polynomial.
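The divide-and-conquer step above can be sketched as a textbook recursive FFT (a minimal sketch, assuming the length is a power of two; names are mine, not the code linked later in the article):

```cpp
#include <cmath>
#include <complex>
#include <vector>

using cd = std::complex<double>;

// Recursive FFT sketch: split into even- and odd-indexed coefficients
// (the a0 and a1 parts), transform each half recursively, then combine
// with A(w^k) = a0(w^(2k)) + w^k * a1(w^(2k)).
// The second loop line uses w(n, k + n/2) = -w(n, k).
// Assumes a.size() is a power of two.
void fft(std::vector<cd>& a) {
    const size_t n = a.size();
    if (n == 1) return;
    std::vector<cd> even(n / 2), odd(n / 2);
    for (size_t i = 0; i < n / 2; ++i) {
        even[i] = a[2 * i];      // coefficients of a0
        odd[i]  = a[2 * i + 1];  // coefficients of a1
    }
    fft(even);
    fft(odd);
    const double PI = std::acos(-1.0);
    for (size_t k = 0; k < n / 2; ++k) {
        cd w = std::polar(1.0, 2.0 * PI * k / n);  // w(n, k)
        a[k]         = even[k] + w * odd[k];
        a[k + n / 2] = even[k] - w * odd[k];
    }
}
```

As a sanity check, the transform of 1 + x + x^2 + x^3 evaluates it at the four 4th roots of unity, giving 4 at x = 1 and 0 at the other three roots.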

Now, we multiply the FFTs of both polynomials pointwise and calculate the inverse FFT of this result to get the resultant polynomial.
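For completeness, here is one hedged sketch of that whole pipeline (all names are mine). The inverse transform here simply runs the same recursion with the opposite rotation direction and divides by n at the end; the inverse FFT itself is covered properly in the next article.

```cpp
#include <cmath>
#include <complex>
#include <vector>

using cd = std::complex<double>;

// Same recursive FFT as before, with a flag: invert=true uses the
// conjugate roots, which (after a final division by n) undoes the transform.
void fftRec(std::vector<cd>& a, bool invert) {
    const size_t n = a.size();
    if (n == 1) return;
    std::vector<cd> even(n / 2), odd(n / 2);
    for (size_t i = 0; i < n / 2; ++i) {
        even[i] = a[2 * i];
        odd[i]  = a[2 * i + 1];
    }
    fftRec(even, invert);
    fftRec(odd, invert);
    const double ang = 2.0 * std::acos(-1.0) / n * (invert ? -1 : 1);
    for (size_t k = 0; k < n / 2; ++k) {
        cd w = std::polar(1.0, ang * k);
        a[k]         = even[k] + w * odd[k];
        a[k + n / 2] = even[k] - w * odd[k];
    }
}

// Multiply two integer polynomials: pad to a power of two, FFT both,
// multiply pointwise, inverse FFT, divide by n, round back to integers.
std::vector<long long> multiplyFFT(const std::vector<long long>& a,
                                   const std::vector<long long>& b) {
    std::vector<cd> fa(a.begin(), a.end()), fb(b.begin(), b.end());
    size_t n = 1;
    while (n < a.size() + b.size()) n <<= 1;
    fa.resize(n);
    fb.resize(n);
    fftRec(fa, false);
    fftRec(fb, false);
    for (size_t i = 0; i < n; ++i) fa[i] *= fb[i];  // pointwise product
    fftRec(fa, true);
    std::vector<long long> c(a.size() + b.size() - 1);
    for (size_t i = 0; i < c.size(); ++i)
        c[i] = std::llround(fa[i].real() / static_cast<double>(n));
    return c;
}
```

On the earlier example, multiplying (1 + 2x) by (3 + 4x) recovers {3, 10, 8}, matching the naive method.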

**So, how much time is taken in calculating the FFT and inverse FFT?** At each level we compute the values for half the indices and obtain the other half from the stored results, and the combining step over the second half takes O(n) time. This gives us the recurrence relation T(n) = 2*T(n/2) + O(n). Seen this somewhere? Yes, it is the recurrence relation for merge sort, and hence the time complexity must be O(n*log(n)), which can also be derived from the Master Theorem.

I have written the detailed code for this and it can be found here. We will discuss the part about calculating the inverse FFT in the next article, as you might get bored reading such a large article. Until next time, Adios Amigos!