Path: sparky!uunet!dtix!darwin.sura.net!wupost!psuvax1!rutgers!njitgw.njit.edu!hertz.njit.edu!dic5340
From: dic5340@hertz.njit.edu (David Charlap)
Newsgroups: comp.os.os2.programmer
Subject: Re: SHORT, USHORT, and other absurdities
Message-ID: <1992Sep1.225220.20880@njitgw.njit.edu>
Date: 1 Sep 92 22:52:20 GMT
References: <1992Aug24.201535.27997@njitgw.njit.edu> <912@nazgul.UUCP>
Sender: news@njit.edu
Organization: New Jersey Institute of Technology, Newark, N.J.
Lines: 109
Nntp-Posting-Host: hertz.njit.edu

In article <912@nazgul.UUCP> bright@nazgul.UUCP (Walter Bright) writes:
>In article <1992Aug24.201535.27997@njitgw.njit.edu> dic5340@hertz.njit.edu (David Charlap) writes:
>/Because the C types short, long, and int are not fixed in size. int
>/is simply the machine word, long is double that, and short is
>/(usually) int. By aliasing them, your program can assume the size.
>/SHORT is always 2 bytes, LONG is always four. If you go to a system
>/whose machine word is 4 bytes, then the headers will simply define
>/SHORT as an array of 2 chars, LONG as an int, and there will be a new
>/type for the long.
>
>Perhaps. But that isn't the way it works in practice. SHORT is used as
>if arithmetic is allowed on it, and char[2] can't handle arithmetic
>operations. The PM and Windows header files excessively and inconsistently
>use the typedefs to the point where they are pretty well useless.
>If you mean SHORT to be "exactly 16 bits, no more, no less" then it must
>be documented so, or your users will assume it means "short", and will use
>it such. When lots of code assumes SHORT means "short", then you cannot
>change it to mean anything else. A superior approach is to typedef at a
>higher level of abstraction, as in:
>    typedef int WINDOW_HANDLE;
>Then WINDOW_HANDLE could later be typedef'd to be an int, short, double,
>struct, void*, etc., and if you were consistent then user code only needs
>recompiling. No such luck with SHORT.

This is done in most cases. That's why you have types like HWND, HMQ,
HAB, etc.
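
For example, the handle types boil down to something like this (a
sketch of the idea, not the literal toolkit headers):

    /* User code never looks inside a handle, so the underlying type
     * can change without any caller noticing.
     */
    typedef unsigned long LHANDLE;   /* whatever the OS wants it to be */
    typedef LHANDLE HAB;             /* anchor-block handle  */
    typedef LHANDLE HMQ;             /* message-queue handle */
    typedef LHANDLE HWND;            /* window handle        */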

If you read the documentation on data types (the CP reference in the
2.0 toolkit or the data types reference in the 1.3 toolkit), you'll
see SHORT defined as "a signed integer in the range -32768 to 32767".
That is the definition; anything beyond it is an assumption of your
own. All the types are defined like this. The typedefs in the headers
are for the compiler's benefit, not yours.
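
For instance, a header might pin the ranges down roughly like this (a
sketch, assuming a compiler where short is 16 bits and long is 32
bits; on a different compiler only these typedefs would change, not
the code written against them):

    typedef short          SHORT;    /* signed,   -32768..32767           */
    typedef unsigned short USHORT;   /* unsigned, 0..65535                */
    typedef long           LONG;     /* signed,   -2147483648..2147483647 */
    typedef unsigned long  ULONG;    /* unsigned, 0..4294967295           */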

>Your code is not portable between 16 and 32 bit OS/2 anyway. IBM completely
>changed the API.

You obviously don't know this at all. I have successfully recompiled
much 16-bit code with gcc/2. All I did was change the message
parameter from USHORT to ULONG, and it all compiled fine. Very few of
the 16-bit APIs have changed. Some have added new subfunctions, but
the parameter lists have (mostly) gone unchanged. There are new APIs
with different names for the completely new functions.
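
The change I mean is this one-liner (a sketch of a PM window
procedure; MRESULT, EXPENTRY, MPARAM, and WinDefWindowProc come from
the toolkit headers). Under 1.x the msg parameter was a USHORT; under
2.0 it's a ULONG, and the body recompiles unchanged:

    MRESULT EXPENTRY ClientWndProc(HWND hwnd, ULONG msg,  /* was USHORT msg */
                                   MPARAM mp1, MPARAM mp2)
    {
        switch (msg)
        {
        /* ... the same cases as the 16-bit version ... */
        default:
            return WinDefWindowProc(hwnd, msg, mp1, mp2);
        }
    }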

>/Etc. In some cases, it's not necessary (although char doesn't have to
>/be one byte - especially in a double-byte-character OS, so CHAR is
>/defined as the character size, and BYTE is a byte), but all base types
>/are redefined to make the code more readable.
>
>A double byte character OS is much more likely to use wchar_t, not char.
>Too much code depends on char being a byte. Windows NT is a unicode OS,
>but a char is still a char. I take issue with your statement that
>SHORT as short makes it more readable, *especially* if somebody redefines
>SHORT to be something other than short!

It shouldn't matter to you what SHORT is typedef'd to. It is
always "a signed integer with a range of -32768 to 32767". That's all
you need to know.

And code is much more readable if you can immediately tell a type from
a variable by looking at the case. But that's preference, and has no
bearing on the OS.

>/Why? All OS/2 programs use these types. The point is that short may
>/not be 2 bytes on your hardware. SHORT will always be.
>
>Well, not my OS/2 programs. And I have the same code running on 16 and
>32 bit OS/2. In fact, I compared the API's of 16 and 32 bit OS/2, and wrote
>my own .h file for it which used things like "int" for sizes that changed.
>The IBM .h files used "SHORT" for the 16 bit version and "LONG" for the
>32 bit one. Ugh. I changed that to "int". No #ifdef's!

I hope you always use compilers where int is 16 bits for 1.x code and
32 bits for 2.x code. If you switch to a compiler that has, say, a
16-bit int everywhere, or a 32-bit int everywhere, your code is going
to bomb.
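
If you insist on plain int, at least make the assumption blow up at
compile time. A check along these lines (a common C trick, not
anything from the toolkit headers) refuses to compile the moment int
is not the width the code was written for:

    /* The array size goes negative, and compilation fails, whenever
     * int is not 32 bits wide.
     */
    typedef char int_must_be_32_bits[sizeof(int) == 4 ? 1 : -1];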

>You could still write everything with "far", and if your compiler supported
>"__far" instead, you could:
>    #define far __far
>You haven't gained anything with FAR.

But you'd have to do that yourself, and with FAR it's already been
done for you. If you want 16-bit code, you have to type something
anyway, so why not use an identifier that will be supported by
everybody's compiler?
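
Roughly the idea (a sketch, not the literal toolkit text;
COMPILER_SPELLS_IT__FAR is a made-up macro standing in for whatever
predefined symbol identifies your compiler):

    /* Pick the keyword's spelling once, in one place. */
    #ifdef COMPILER_SPELLS_IT__FAR
    #define FAR __far
    #else
    #define FAR far
    #endif

    char FAR *buffer;   /* the declaration reads the same either way */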

>/The OS/2 headers attempt to make code independent of machine word
>/sizes, which C does not do.
>/Oddly enough, they never defined a MESSAGE type - messages still use
>/USHORT or ULONG as their type.
>
>Making code independent of word sizes is best accomplished by typedef'ing
>at the MESSAGE level, not at the SHORT or LONG level. I submit that 90%
>of programs that scrupulously use SHORT and LONG typedefs will have to
>be carefully gone through and adjusted if SHORT/LONG are *ever* set to
>anything other than short/long.

IT DOESN'T MATTER!!!! An API call that has a SHORT parameter needs a
16-bit signed integer as its parameter. Who cares whether the variable
you pass it is declared short, long, char, or whatever?!?! None of
those sizes is pinned down by the ANSI spec, and it is very poor
practice to write code that depends on the size of any of them.
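
For what it's worth, a message-level typedef would look something like
this (hypothetical; as noted above, the toolkit never actually defines
one, and only the typedef itself would differ between the 1.x and 2.0
headers):

    #define INCL_WIN
    #include <os2.h>

    typedef ULONG MESSAGE;    /* a 1.x header would have said USHORT here */

    /* Calling code reads the same no matter how wide MESSAGE really is. */
    void tell_window_to_quit(HWND hwnd)
    {
        MESSAGE msg = WM_QUIT;
        WinPostMsg(hwnd, msg, (MPARAM)0, (MPARAM)0);
    }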


--
 |) David Charlap          "TELEPHONE, n. An Invention of the devil which
/|_ dic5340@hertz.njit.edu  abrogates some of the advantages of making a
((|,)                       disagreeable person keep his distance."
 ~|~                            --- Ambrose Bierce, The Devil's Dictionary