J. David Bryan wrote:
`int'
So it is an integer with an implementation-defined length.
Not really implementation defined, but ABI defined.
The guarantee is that it's the same in GPC and GCC. I don't know if MS C guarantees anything.
MS C's "int" is also an integer with an implementation-defined length. The problem is that there is no guarantee that the two implementations define the length identically (which is clearly required for interoperability).
The ABI gives this guarantee (that's its purpose).
AFAIK, there are GNU C interfaces, aren't there? Which types do they use?
I don't know; is it relevant? They may have something that "works," but is it correct?
If it isn't, the thing to do is to fix it there (i.e., in GCC's ABI). In free software, you try to fix the problems at the root whenever possible (and it would be possible here since the "root" in this context would be GCC) ...
Is this a library interface or an ABI description?
ABI description (from their "Platform SDK" documentation).
And do you have any reason to doubt that GCC follows the ABI? If so, the correct thing to do is to report this to the GCC maintainers for this target and ask them to fix it.
...do you really expect the interfaces to change to `short' (or whatever is 32 bit then)? I wouldn't. I'd expect the C types to remain the same, even if the sizes vary.
My expectations are the same, and that's the problem.
No, that's the solution. :-)
Some years ago, I wrote a program in MS Pascal (!) to call the 16-bit interface. It worked fine under 16-bit Windows. I moved it to GPC but retained the calls to the 16-bit routines, and then it failed because GPC assumed 32-bit Integers, whereas MS Pascal assumed 16-bit Integers. The program worked fine again -- with GPC -- once all of the "Integer"s in the interface definition were changed to "ShortInt"s.
So MS Pascal didn't follow the ABI. How is this relevant to this discussion?
If I write a program today to call the 32-bit interface, and I use "Integer" in GPC, all is well...until GPC changes to use 64-bit "int"s, and then, presumably, Integer becomes 64 bits as well. At that point, if I still call the 32-bit interface, my program fails once again until I fix up all of the parameter types.
GPC's sized integers appear to be intended to address this very problem. An "Integer (16)" stays 16 bits, always, regardless of what GPC uses for "Integer" this year.
Maybe that's the source of the confusion. The ABI doesn't change every year -- in fact, once it's established, it should never change. 64 bit Windows will be a new target with a new ABI. So then you'll have a 32 and a 64 bit Windows target. Maybe the system will (for some time) support both platforms, then you can choose which one you use.
(At least that's what I know from other systems, and which makes porting to 64 bits generally rather easy -- just recompiling with a compiler for the new target -- as long as you didn't make any assumptions ...)
But presuming that "int" will be the same across platforms is just such an assumption.
I'm not presuming this. `int' may well be 64 on 64 bit Windows if that's according to its ABI. Or it may remain 32 bits. I don't know.
(And indeed, as I recall, MS had a lot of initial trouble moving from 16-bit to 32-bit interface calls, because under 16-bit Windows, there were a number of 32-bit parameters that were declared as "long"s -- which were 32 bits under the 16-bit compiler but became 64 bits under the 32-bit compiler. And so a lot of programs failed to work when recompiled, because they remained 32-bit parameters under the new interface.)
So the interface was changed (against the ABI). Of course, this may happen, but in general it's more reasonable to assume that the 64 bit interfaces will follow the 64 bit ABI than the 32 bit one.
"WordBool" and "Boolean (32)" solve different problems: the former when interfacing to a GNU C routine that uses "unsigned int" and where the parameter size is intended to track changes in the size of GNU C's "int", and the latter when interfacing to third-party routines that require specific parameter sizes.
Not really. It's meant more for things like file formats (with the problems of endianness etc. that Chuck mentioned) or memory layout (e.g., of memory mapped hardware), i.e. things which must have the same size on every platform. Interfacing to external code is better done via ABI specifications.
Prof. A Olowofoyeku (The African Chief) wrote:
On 18 Dec 2002 at 4:25, Frank Heckenbach wrote:
[...]
AFAIK, there are GNU C interfaces, aren't there? Which types do they use?
All sorts of typedefs and macros mapping various things to various other GNU things. It is easier for gcc, since they are dealing with interfaces designed for C programmers.
But once it's done in GCC, it can be straightforwardly mapped to GPC (if you figure out the typedefs and macros, that is ;-).
Perhaps. The interfaces are done as a separate project (w32api) and not as part of the standard GCC,
Of course. The ABI is done within GCC, and mapped 1:1 in GPC. The C interfaces use the ABI, and therefore can also be mapped 1:1 in Pascal.
and it is all in C of course. But, yes, once one can see how the C interface has been implemented, it is just a "simple" case of converting "them" to Pascal (and there is a lot of "them" to be converted ...) if one can figure out what they are doing,
Exactly. (Maybe one day there will be an automatic converter, but don't hold your breath. If/when it will be written, it will use this 1:1 type mapping, of course.)
AND there is a way to do it in Pascal. Some unions and structures are impossible to convert because there is no way to achieve the conversion in Pascal,
I don't remember if we talked about this, but can you give an example for something not convertible?
CBFalconer wrote:
You have further problems to do with "who removes the parameters". Note that you can generate and call with one single large block, which you treat as one parameter, and the destination treats as a collection of items. Treatment of returned values is another whole area.
This is also resolved already in the ABI and partly using attributes.
Maurice Lombardi wrote:
So keep the current status: ShortInt, Integer (16) and also the standard packed (-32768..32767) all defined.
Well, I guess I'll leave everything as it is. Actually, I'm getting a little tired of this discussion, and unless really new points are brought up, I probably won't reply anymore.
Keep in mind that there is a parsing conflict. I'm not working on it now. But when I will, and if it turns out that the `Integer (16)' etc. types are a serious problem in this regard, I'll simply drop them then (a bug is always more important than some non-standard feature). This means all who use it will have to change their code quite suddenly then.
I tried to arrange for a smoother change, but the "don't change anything" faction seems to be in the majority. The alternative suggestions from Waldek and me were criticized or neglected.
I don't have more time to waste on these matters now.
Frank