Could we agree to disagree and come together on improvements and clean-up?
Joerg Arndt
arndt at jjj.de
Sun Jun 1 13:02:08 CEST 2008
* Vincent Lefevre <vincent at vinc17.org> [Jun 01. 2008 20:12]:
> On 2008-05-31 17:11:59 +0200, Jan Wielemaker wrote:
> > Hopefully without starting another flamewar, let us conclude there is
> > only one party to blame. If any other (read: smaller) party had done
> > this, they would have been completely ignored by developers, and
> > rightly so. When I learned C, the biggest integer type was long, and
> > a long always fit a pointer (yes, I've seen 8, 16, 32 and 64 bits).
>
> AFAIK, the C standard has never required this (BTW, even if a pointer
> size is not larger than the size of a long, the pointer-to-integer
> conversion is implementation-defined and possibly undefined, so that
> you can't deduce anything about it in portable code).
>
> > Breaking this, certainly now that 32 bits is getting small for a
> > general-purpose integer, is unforgivable.
>
> No, code based on such undefined behavior and meant to work on
> any C implementation (e.g. a future implementation) is buggy, just
> like code that assumes that signed integer arithmetic and pointer
> arithmetic are done modulo 2^(bit width of the data type).
>
> Note: Other people could complain that old implementations had a long
> that was always 32 bits, and that breaking it would be unforgivable. So,
> to make everyone happy, integer types larger than 32 bits would not
> have been possible.
>
> --
> Vincent Lefèvre <vincent at vinc17.org> - Web: <http://www.vinc17.org/>
> 100% accessible validated (X)HTML - Blog: <http://www.vinc17.org/blog/>
> Work: CR INRIA - computer arithmetic / Arenaire project (LIP, ENS-Lyon)
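To make the modulo-2^n point above concrete, here is a throwaway
sketch of mine (not GMP code); the first function invokes undefined
behavior, so an optimizer is free to drop the test altogether:

  #include <limits.h>

  /* Buggy: meant for b > 0 and relies on signed overflow wrapping
     around, which is undefined behavior; a compiler may assume the
     overflow never happens and optimize the test away. */
  int add_overflows_bad(int a, int b)
  {
      return a + b < a;
  }

  /* Portable: compare against the limits before adding. */
  int add_overflows_ok(int a, int b)
  {
      return (b > 0 && a > INT_MAX - b) || (b < 0 && a < INT_MIN - b);
  }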
Avoiding stupid things like casting pointers to integers
(or better, avoiding casts of pointers to anything but other pointer
types, as defined by the standard) is a must.
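If a pointer really must round-trip through an integer, C99's
<stdint.h> provides uintptr_t for exactly that (optional in the
standard, but available nearly everywhere); a minimal sketch of mine,
not anything from GMP:

  #include <stdint.h>       /* C99; uintptr_t is optional but common */
  #include <stdio.h>

  int main(void)
  {
      int x = 42;
      void *p = &x;

      /* Non-portable: long may be narrower than a pointer (LLP64). */
      /* long bad = (long) p; */

      /* void* -> uintptr_t -> void* is guaranteed to yield a pointer
         that compares equal to the original. */
      uintptr_t bits = (uintptr_t) p;
      void *q = (void *) bits;

      printf("%d\n", *(int *) q);  /* prints 42 */
      return 0;
  }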
However, taking care of everything the standard allows is IMHO
unwise. To take one example, ones' complement: I'd be surprised if
more than one percent of all existing code did what it is meant to do
when compiled in a ones'-complement environment.
The best one could do is a completely unoptimized branch in the code.
This may serve as a bridge to new archs/models. Given the huge amount
of work required, I'd be surprised if anyone started this.
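What such an unoptimized bridge could look like, as a rough sketch of
mine (assuming the preprocessor test matches the target's arithmetic;
LONG_MIN is ignored for brevity):

  /* Arithmetic shift right by one, i.e. floor division by two. */
  long asr1(long x)
  {
  #if ((-1L) >> 1) == -1L    /* two's complement with arithmetic >> */
      return x >> 1;                        /* fast, sane-arch branch */
  #else
      /* Unoptimized bridge: representation-agnostic floor(x / 2). */
      return x < 0 ? -((-x + 1) / 2) : x / 2;
  #endif
  }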
Roughly, the code should work on what I call a sane arch/model
(a compile-time check along these lines is sketched after the list):
- long is the machine word,
- byte order is little- or big-endian (not 1342 or such),
- two's complement,
- no exception on integer overflow,
- bits per int/long is a power of two,
- [add your sanity requirements].
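Such assumptions are cheap to enforce at build time; something along
these lines (entirely my own sketch, not GMP's configury; pre-C11, so
a negative array size stands in for static_assert):

  #include <limits.h>

  /* "long is the machine word", taken here as "long holds a pointer". */
  typedef char assert_long_word[sizeof(long) >= sizeof(void *) ? 1 : -1];

  /* Two's complement: -1 has all value bits set. */
  typedef char assert_twos_compl[(-1 & 3) == 3 ? 1 : -1];

  /* 8-bit bytes, so "bits per long is a power of two" reduces to
     "sizeof(long) is a power of two". */
  typedef char assert_char_bit_8[CHAR_BIT == 8 ? 1 : -1];
  typedef char assert_pow2_long[
      (sizeof(long) & (sizeof(long) - 1)) == 0 ? 1 : -1];

  /* Byte order and overflow behaviour need runtime/configure checks. */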
Not-so-sane archs/models _will_ require much extra work, but
I do not see any point in anticipating that some vendor will be
silly enough to ship such an environment.
Should there be a need for a compiles-everywhere version
of GMP for gcc, I'd expect the gcc people to help out.
cheers, jj