# Pseudo prime tests

Torbjorn Granlund tg at gmplib.org
Mon Jun 11 12:48:14 CEST 2012

```
I have read up on this subject.

Selfridge and Pomerance have noticed that an M-R test with base 2 plus
a Fibonacci test (Fib_n + Legendre(n,5) == 0 (mod n)) seems to correctly
identify composite numbers.  They will give USD 620 to anyone who
finds a composite identified as a prime, or proves that their
suggested test actually is a true primality test.

(I think computing Fib_n mod n for the number n can be done at about the
same cost as a modexp operation on log(n)-bit numbers.  We could use the
same basic algorithm we use today for computing Fibonacci numbers, except
that we need to stick mod operations here and there.  But we should
also perhaps use REDC residues.)

No counterexample exists under 10^17.  We could use this for a
one-argument mpz_pprime_p(n) test, if we carefully document its
properties.  Nobody can promise a specific likelihood that a number
passing this test is prime (unless it is < 10^17, where the
likelihood is 1).

Furthermore, I am starting to understand problematic numbers for M-R
tests.  Numbers of the form n = pq, where p,q are prime and q=kp-k+1,
or equivalently q-1 = k(p-1), will have more "false witnesses" than a
uniformly random n.

The smaller k, the more false witnesses; k=2 gives the worst case of
1/4.

Perhaps we could recognise all numbers with many false witnesses, and
then run far fewer M-R tests for a given probability?  This could speed
things up by perhaps a factor of 2 to 3, while still allowing us to
conservatively promise strict likelihood limits.

1. Do some trial division; return if a factor is found.
2. Run a Miller-Rabin test with, say, base 2; return if it proves n
composite.
3. Recognise known-bad forms of composites for (say) k < 10
(i.e., solve a quadratic equation for each k).  Return if
a factor is found.
4. Run far fewer Miller-Rabin iterations than normally needed...

(I haven't forgotten 3-factor composites, just ignored them for
simplicity...)

--
Torbjörn
```