Polynomial Multiplication over Finite Fields in Time \( O(n \log n) \)

Abstract
Assuming a widely believed hypothesis concerning the least prime in an arithmetic progression, we show that polynomials of degree less than \( n \) over a finite field \( \mathbb{F}_q \) with \( q \) elements can be multiplied in time \( O(n \log q \log (n \log q)) \), uniformly in \( q \). Under the same hypothesis, we show how to multiply two \( n \)-bit integers in time \( O(n \log n) \); this algorithm is somewhat simpler than the unconditional algorithm from the companion paper [22]. Our results hold in the Turing machine model with a finite number of tapes.