# Multiplication's Complexity

Vincent Diepeveen diep at xs4all.nl
Tue Mar 4 01:48:00 UTC 2014

```
On Mon, 3 Mar 2014, Farnam Mansouri wrote:

> Hi,
>
> I have a question regarding the complexity of the FFT algorithm in
> multiplication.
>
> I couldn't quite understand the implemented algorithm and its complexity
> based on the explanations on page 96 of the manual (or
> https://gmplib.org/manual/FFT-Multiplication.html#FFT-Multiplication). It
> is mentioned that the Schonhage-Strassen algorithm is being used; but the
> complexity of the algorithm is O(N^(k/(k-1))), whereas in other
> literature it can be found that the complexity of the
> Schonhage-Strassen algorithm is on the order of O(N log N log log N). I
> know that there are multiple levels of recursion for integer multiplication
> (for the pointwise multiplications), but how can the complexity of the
> algorithm be like that (without any factor of "log", which I believe
> every FFT-based algorithm should have somehow)?
>
> Can someone please explain that to me?

You can rewrite O(N log N log log N) as O(N^x) as well.
Note that in practice you are soon looking at Theta(2n log 2n log log 2n),
with n being the number of limbs. Usually 16 bits or so get stored
per limb, which in these FFT codes is usually a double.

The more optimized codes, such as Woltman's (which does not use the above
transform), store 18 bits or so per double.

> P.S. I am using GMP integer multiplication (for extremely large integers)
> in some of my algorithms. As a result the complexity of my algorithm
> depends on the complexity of GMP's FFT multiplication.

Tried Woltman's library already? It's a lot faster...
```
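The equivalence between the two bounds can be sketched numerically. This is a hypothetical illustration, not GMP's actual tuning: assuming one FFT level of fixed order k turns an N-bit product into 2^k pointwise products of roughly N/2^(k-1) bits each (the doubling comes from the wraparound in the negacyclic convolution), the master theorem on T(N) = 2^k T(N/2^(k-1)) + O(N) gives the exponent quoted in the GMP manual, and that exponent tends to 1 as k grows:

```python
# Sketch of why O(N^(k/(k-1))) and O(N log N log log N) are consistent.
# Assumption (illustrative, not GMP's actual recursion): a single FFT
# level of fixed splitting order k yields 2^k pointwise products of
# about N/2^(k-1) bits, so T(N) = 2^k * T(N / 2^(k-1)) + c*N and the
# master theorem gives T(N) = O(N^x) with x = log_{2^(k-1)}(2^k).

def fixed_k_exponent(k):
    """Exponent x such that T(N) = O(N^x) for one fixed FFT order k."""
    return k / (k - 1)  # log base 2^(k-1) of 2^k subproblems

for k in range(4, 11):
    print(k, round(fixed_k_exponent(k), 4))

# The exponent decreases toward 1 with k; choosing k to grow with N
# (roughly log log N levels) is what recovers N log N log log N,
# so the N^(k/(k-1)) bound holds per fixed recursion depth only.
```

For any fixed k the polynomial bound dominates N log N log log N, which is why the manual's per-level bound and the textbook bound describe the same algorithm at different levels of detail.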
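The storage overhead Vincent alludes to can also be made concrete. Using the figures from the post as assumptions (16 to 18 bits of the operand per 64-bit double, the rest being headroom for floating-point round-off in the convolution), a rough back-of-the-envelope calculation looks like:

```python
# Rough storage comparison for floating-point FFT multiplication.
# Assumed figures (from the post, not measured): ~16-18 payload bits
# are packed per 64-bit double so that round-off error in the
# convolution stays below half an integer.

def doubles_needed(n_bits, bits_per_double):
    """Number of doubles to hold an n_bits operand at the given packing."""
    return -(-n_bits // bits_per_double)  # ceiling division

n = 10**8  # a 100-million-bit operand
for b in (16, 18):
    d = doubles_needed(n, b)
    print(f"{b} bits/double: {d} doubles, {d * 8} bytes")
```

Packing more bits per double shrinks the transform length (and hence memory traffic), which is one reason a tightly tuned floating-point code can outrun an integer Schonhage-Strassen implementation at large sizes.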