On 17 Dec 2002 at 2:48, Frank Heckenbach wrote:
What is a "plain" integer?
`int'
So it is an integer with an implementation-defined length.
The guarantee is that it's the same in GPC and GCC. I don't know if MS C guarantees anything.
MS C's "int" is also an integer with an implementation-defined length. The problem is that there is no guarantee that the two implementations define the length identically (which is clearly required for interoperability).
As long as GPC calls GCC code, all is well, because each type is guaranteed to have the same size on both sides. But when calling third-party interfaces, that assumption doesn't hold. A method is needed to guarantee that a given parameter is, e.g., 16 bits. "ShortInt" doesn't do that, if the GPC manual is to be believed.
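For instance, here is a minimal sketch of the hazard (the routine, its name, and its C-side prototype are all invented, and GPC's `external name' linkage syntax is assumed):

    { Hypothetical import of a third-party routine whose ABI demands a
      16-bit parameter.  GPC's ShortInt matches GCC's `short', whose
      width is itself implementation-defined, so nothing in this
      declaration actually pins the parameter to 16 bits. }
    procedure SetWidgetSize (Size: ShortInt); external name 'set_widget_size';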
AFAIK, there are GNU C interfaces? Which types do they use?
I don't know; is it relevant? They may have something that "works," but is it correct?
Is this a library interface or an ABI description?
ABI description (from their "Platform SDK" documentation).
...do you really expect the interfaces to change to `short' (or whatever is 32 bit then)? I wouldn't. I'd expect the C types to remain the same, even if the sizes vary.
My expectations are the same, and that's the problem.
MS went from an interface where "int" was 16 bits to an interface where "int" was 32 bits, and I understand that they will be providing an interface where "int" is 64 bits. The 16- and 32-bit interfaces co-existed for a time. I expect the 32- and 64-bit interfaces will as well.
Some years ago, I wrote a program in MS Pascal (!) to call the 16-bit interface. It worked fine under 16-bit Windows. I moved it to GPC but retained the calls to the 16-bit routines, and then it failed, because GPC assumed 32-bit "Integer"s whereas MS Pascal assumed 16-bit ones. The program worked fine again -- with GPC -- once all of the "Integer"s in the interface definition were changed to "ShortInt"s.
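To illustrate the kind of change involved (the routine name here is made up; the real interface definition contained many such declarations):

    { Originally, for MS Pascal, where Integer was 16 bits:
        function GetThing (Handle: Integer): Integer; external name 'GetThing';
      Under GPC, Integer is 32 bits, so every `Integer' in the
      interface definition had to become `ShortInt', which happens to
      be 16 bits on this target: }
    function GetThing (Handle: ShortInt): ShortInt; external name 'GetThing';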
If I write a program today to call the 32-bit interface, and I use "Integer" in GPC, all is well...until GPC changes to use 64-bit "int"s, and then, presumably, Integer becomes 64 bits as well. At that point, if I still call the 32-bit interface, my program fails once again until I fix up all of the parameter types.
GPC's sized integers appear to be intended to address this very problem. An "Integer (16)" always stays 16 bits, regardless of what GPC uses for "Integer" this year. If a particular interface demands certain-sized parameters, e.g., a 32-bit Boolean, then it appears to me that "Boolean (32)" will always produce the correct type, whereas "WordBool" will produce the correct type now but may not in the future. That is surely what the GPC manual cautions about in its section on sized integers.
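A minimal sketch of what that looks like in practice (the imported routine and its name are hypothetical):

    type
      Int16  = Integer (16);   { exactly 16 bits, this year and next }
      Bool32 = Boolean (32);   { exactly 32 bits, unlike WordBool }

    { Hypothetical third-party import demanding a 32-bit Boolean: }
    procedure EnableFeature (Flag: Bool32); external name 'enable_feature';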
(At least that's what I know from other systems, and it makes porting to 64 bits generally rather easy: just recompiling with a compiler for the new target -- as long as you didn't make any assumptions....)
But presuming that "int" will be the same across platforms is just such an assumption. (And indeed, as I recall, MS had a lot of initial trouble moving from 16-bit to 32-bit interface calls, because under 16-bit Windows a number of 32-bit parameters were declared as "long"s -- 32 bits under the 16-bit compiler, but 64 bits under the 32-bit compiler. And so a lot of programs failed when recompiled, because those parameters remained 32 bits wide under the new interface.)
The problem is that C doesn't have, as far as I know, a method of declaring an integer of a guaranteed size, which makes interfacing across compilers problematic. You can pick some types that happen to work today, but they aren't guaranteed to work tomorrow. ISO Pascal, of course, has the same problem. GPC appears to solve that problem by providing sized integers. (Ada also recognizes this problem and solves it by providing optional "representation attributes" to guarantee a particular size where needed.)
"WordBool" and "Boolean (32)" solve different problems: the former when interfacing to a GNU C routine that uses "unsigned int" and where the parameter size is intended to track changes in the size of GNU C's "int", and the latter when interfacing to third-party routines that require specific parameter sizes.
-- Dave