problem with NTL 5.5 when upgrading from GMP 4.2.4 to 4.3.0
shoup at cims.nyu.edu
Sat Apr 25 20:57:08 CEST 2009
Paul,
Thanks for all the info.
I just got back to my email.
This presents some problems, as NTL documents its XGCD routine
quite precisely -- that it returns the coefficients as computed
by Euclid's algorithm. I'd prefer to keep things that way.
I have two options.
The first is to use another implementation of XGCD, bypassing
GMP completely. The code is actually there, commented out,
but it is surely not as fast as GMP's.
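For reference, the classical iteration is roughly the sketch below -- just
plain C for nonnegative inputs, not the actual commented-out code in NTL:

   /* Classical iterative extended Euclid: returns g = gcd(a, b) together
      with cofactors s, t satisfying s*a + t*b = g, exactly as the
      textbook iteration produces them (which also pins down their signs). */
   long xgcd(long a, long b, long *s, long *t)
   {
       long s0 = 1, s1 = 0, t0 = 0, t1 = 1;
       while (b != 0) {
           long q = a / b;
           long r = a - q*b;
           a = b;  b = r;
           long s2 = s0 - q*s1;  s0 = s1;  s1 = s2;
           long t2 = t0 - q*t1;  t0 = t1;  t1 = t2;
       }
       *s = s0;  *t = t0;
       return a;
   }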
The second is to use GMP's XGCD, and to post-process the results.
The magnitudes of the cofactors can be made unique. Unfortunately,
the signs of the cofactors cannot. You really need to run Euclid's
algorithm to at least get the parity of the number of division steps.
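Concretely, the post-processing would be something along these lines -- a
sketch built on GMP's mpz_gcdext, assuming positive a and b; it forces
0 <= s < b/g, which makes the magnitudes unique, but it says nothing about
the signs the Euclidean iteration would have produced:

   #include <gmp.h>

   /* Compute g = gcd(a, b) and cofactors with s*a + t*b = g via
      mpz_gcdext, then reduce s into [0, b/g) and recompute t so the
      identity still holds.  Sketch only; assumes a, b > 0. */
   void xgcd_normalized(mpz_t g, mpz_t s, mpz_t t,
                        const mpz_t a, const mpz_t b)
   {
       mpz_t bg, tmp;
       mpz_init(bg);
       mpz_init(tmp);

       mpz_gcdext(g, s, t, a, b);   /* GMP's own choice of cofactors */

       mpz_divexact(bg, b, g);      /* bg = b/g */
       mpz_mod(s, s, bg);           /* now 0 <= s < b/g */

       /* t = (g - s*a) / b, an exact division by construction */
       mpz_mul(tmp, s, a);
       mpz_sub(tmp, g, tmp);
       mpz_divexact(t, tmp, b);

       mpz_clear(bg);
       mpz_clear(tmp);
   }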
I believe at some point I convinced myself that GMP's XGCD routine's
documentation said that the cofactors are "as returned by the extended
Euclidean algorithm", and I based my implementation on this.
I don't believe many applications really care about all of this *too* much.
I'll have to take a closer look.
I might have to opt for loosening up the guarantees made by NTL.
-- Victor
> Torbjörn,
>
>> Should we slow down the code a little and get a more strictly bounded
>> cofactor value for the price? The price is small, but if the operands
>> are just a few limbs, it might still hurt.
>
> (a) having loose bounds will help to detect bugs in applications, as this
> example shows
>
> (b) having strict bounds will help applications: without them, some
> applications might have to do extra work to get unique cofactors, and the
> correctness of some algorithms might depend on strict bounds
>
> I vote for (b) if the extra price is small.
>
> Paul
>
>