>|> In article <1992Nov30.204147.7955@cimage.com> ejd@cimage.com (Ed Driscoll) writes:
>|> |The typedefs I proposed
>|> |would all specify size in bits, either exactly or at least that many bits.
>|> |Using your example, there might be an at-least-18-bits type and an
>|> |exactly-18-bits type (both with variations for signed and unsigned, I
>|> |suppose). So there'd be no way to get confused about how many bits you
>|> |were getting unless the typedefs themselves were incorrect.
>|>
>|> I don't understand the purpose of the "at-least-N-bits" types. C
>|> already has those -- char (>=8), short (>=16), int (>=16), and long
>|> (>=32). What does your extra level of indirection buy you?
>You aren't correct:
>int is as long as a single machine word (on some machines that means
>12 bits); on those machines a short is also 12 bits and a long is 24
>bits...
>The same goes for characters:
>some compilers give you only the original ASCII character set, and a
>char value on those machines is 7 bits long; all other bits stored in
>memory for that value are ignored...
The behaviour you describe is not allowed in an ANSI compiler, which
must provide the minimum sizes that Ed describes.
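For the record, a conforming <limits.h> makes those guaranteed
minimums visible directly.  A quick sketch (my own illustration, not
from either of the earlier articles):

/* Check the ANSI-guaranteed minimums via <limits.h>.  On a
   conforming implementation none of these can be smaller. */
#include <limits.h>
#include <stdio.h>

int main(void)
{
    printf("CHAR_BIT = %d  (must be >= 8)\n", CHAR_BIT);
    printf("SHRT_MAX = %d  (must be >= 32767)\n", SHRT_MAX);
    printf("INT_MAX  = %d  (must be >= 32767)\n", INT_MAX);
    printf("LONG_MAX = %ld (must be >= 2147483647)\n", LONG_MAX);
    return 0;
}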
[Please hit return every 72 characters or so -- it makes things like
inclusions and following the attributions much easier.]
--
"Insisting on perfect safety is for people who don't have the balls to live
in the real world." -- Mary Shafer, NASA Ames Dryden