Frank Heckenbach wrote:
Scott Moore wrote:
Frank Heckenbach wrote:
BTW, I'm not sure that C ABIs will actually change in the future. Maybe C programmers will simply get used to using `long' almost everywhere, just like programmers of Borland Pascal and, especially, of compatible 32 bit compilers, regularly use `LongInt' ...
It would seem to come down to whether you consider the C standard types as fixed to their original definitions on the DEC PDP-11 (actually the PDP-7, to be pedantic), or whether they were intended to "breathe" with the machine word size.
The ANSI version eliminated the reference to specific sizes, but I think the intent was clearly on the side of int being the natural register size for the machine.
Sure. But I'm not sure if the ABI for one particular target, once determined, will change in the future.
Future new 64 targets might well have 64 bit `int' indeed. But AFAIK for the existing 64 targets (such as Intel, AMD, Sparc, Alpha) ABIs have been determined, and I think it's unlikely to see completely new 64 bit targets soon ...
The ABIs in existence for AMD64 are:
1. The ABI as published by AMD.
2. The ABI advanced by Microsoft for Windows.
Linux has adopted the AMD ABI. The Microsoft ABI does not match it; they went their own way (surprise).
Both ABIs are C oriented, if that matters.
The AMD ABI states that int is 32 bits, but this makes no difference to the ABI, since each register parameter occupies a full 64 bit register. On the stack, all parameters are specified as aligned to 64 bits, which means that assuming int as 64 bit would make no practical difference there, either.
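A minimal sketch of what that means in C terms (the register names and the 8-byte stack-slot rule are taken from the SysV AMD64 ABI description above; the function names are just made up for illustration):

/* Sketch only: per the SysV AMD64 ABI, the first six integer/pointer
   arguments go in full 64 bit registers (rdi, rsi, rdx, rcx, r8, r9),
   and arguments passed on the stack occupy 8-byte-aligned slots.
   So whether the parameters are declared int or long, each one still
   consumes a whole register or a whole 8-byte stack slot. */
long add_ints (int  a, int  b) { return (long)a + (long)b; }
long add_longs(long a, long b) { return a + b; }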
The net difference of int vs. long is in structures, where int takes only 32 bits. I don't really consider that a calling convention matter, since you can declare record elements as subranges (or whatever).
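For example, on an LP64 target such as the AMD64 ABIs above (assuming the usual 32 bit int and 64 bit long), the difference only shows up in the size of aggregates:

#include <stdio.h>

/* Hypothetical structures, just to illustrate the point above on an
   LP64 target (int 32 bits, long 64 bits), e.g. Linux/AMD64: */
struct with_int  { int  a; int  b; };   /* typically  8 bytes */
struct with_long { long a; long b; };   /* typically 16 bytes */

int main(void)
{
    printf("with_int:  %zu bytes\n", sizeof(struct with_int));
    printf("with_long: %zu bytes\n", sizeof(struct with_long));
    return 0;
}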
There is also clearly a precedent for int eventually promoting. The PDP-11 had a 16 bit int, the PC had a 16 bit int, and now it is 32 bits. The 16 bit to 32 bit promotion algorithm appears to have been to leave the int size at the "old" 16 bit meaning, then finally to upgrade it to 32 bits when it became clear that 16 bit machines were dying off on the desktop.
But for any given target (16 bit Dos, 16 bit Windows, 32 bit Windows, 32 bit Linux, BSD, ...) there's only been one ABI, and the new ABIs discussed are already for 64 bit AMD|Intel Unix|Windows, aren't they? So I wouldn't expect a change here anytime soon.
Frank
I believe the (practical) result is that both existing ABIs are int-length agnostic. If I missed something, my apologies.