Hello,
As I'm interested in free software Pascal compilers, I'd like to hear your opinions about the others.
I don't want to start a flamewar - of course yours is the best! But what experience do you have with the others?
I know Free Pascal very well. It comes with a lot of extensions, is fast and produces small binaries. But when you want to program for different platforms, you'll end up with lots of {$IfDef}s.
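To illustrate the point about {$IfDef}s: a minimal sketch of platform-conditional code in FPC's style (the conditional symbols shown are common target defines; the program itself is just illustrative):

```pascal
program PathDemo;

const
  { each new target platform tends to need another branch here }
  {$IfDef Windows}
  DirSeparator = '\';
  {$Else}
  DirSeparator = '/';
  {$EndIf}

begin
  WriteLn ('Directory separator: ', DirSeparator)
end.
```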
I had a look at p2cc, but it seems very weak, doesn't it?
Does anybody have experience with the Pascal compiler from the ACK package? http://tack.sf.net/
Are there others?
On Thursday, 22 Sep 2005, Andres K. Foerster wrote:
Does anybody have experience with the Pascal compiler from the ACK package?
I had a look at it.
It's also very old, and there's actually no native compiler for modern systems. ACK comes from Minix, which is meanwhile free software but has had no update for over 10 years now.
On GNU/Linux systems it's only possible to compile into an intermediate code, which can be run in a virtual machine.
(Resending to gpc@gnu.de -- I noticed the original mail was sent to gpc-de@gnu.de and later resent to gpc@gnu.de, but my reply was wrongly sent only to gpc-de@gnu.de.)
Andres K. Foerster wrote:
As I'm interested in free software Pascal compilers, I'd like to hear your opinions about the others.
I don't want to start a flamewar - of course yours is the best! But what experience do you have with the others?
I know Free Pascal very well. It comes with a lot of extensions, is fast and produces small binaries. But when you want to program for different platforms, you'll end up with lots of {$IfDef}s.
This is one of my main criticisms -- putting the burden of making the code portable on the programmer, rather than designing the system itself to be portable (as much as possible).
The other one is the omission of standard Pascal features. (I'm not so much opposed to extensions; IMHO omissions are much worse.) And from the discussions I've had/seen, this is not accidental but fully intentional, as there seems to be some kind of hatred of and ignorance about the standards among the developers. Yet many standard features would seem rather easy to add. A few, such as schemata, are probably not so easy, but those would actually provide some major benefits (in programming comfort/simplicity for the programmer) compared to the Borland "alternatives" (which can hardly be called that).
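For readers unfamiliar with schemata, a minimal sketch of what they provide in Extended Pascal (the array bound is a parameter of the type, fixed when a variable is declared; the program is just illustrative):

```pascal
program SchemaDemo (Output);

type
  { a schema: `n' is a discriminant, supplied at declaration time }
  Vector (n: Integer) = array [1 .. n] of Real;

var
  v: Vector (3);
  i: Integer;

begin
  for i := 1 to v.n do  { the discriminant is accessible as a field }
    v[i] := i * 1.5;
  WriteLn ('Bound: ', v.n)
end.
```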
I had a look at p2cc, but it seems very weak, doesn't it?
I've almost never used it, but I think so.
Frank
Frank Heckenbach wrote:
I know Free Pascal very well. It comes with a lot of extensions, is fast and produces small binaries. But when you want to program for different platforms, you'll end up with lots of {$IfDef}s.
This is one of my main criticisms -- putting the burden of making the code portable on the programmer, rather than designing the system itself to be portable (as much as possible).
Huh? That's new to me. We try to design everything to be as portable as possible, even taking care of non-mainstream platforms.
The other one is the omission of standard Pascal features. (I'm not so much opposed to extensions; IMHO omissions are much worse.) And
from the discussions I've had/seen, this is not accidental but
fully intentional, as there seems to be some kind of hatred of and ignorance about the standards among the developers.
Well, FPC is made by a community which contributes patches, and it doesn't seem that anybody has contributed a patch for an ANSI mode so far -- which is probably useless anyway as long as the PVS isn't free in the GNU sense.
Anyway, why should anybody care? If you want a compiler trying to be ISO standard compliant: use GPC; if you want a compiler trying to be Borland compatible: use FPC.
Yet many standard features would seem rather easy to add. A few, such as schemata, are probably not so easy, but those would actually provide some major benefits (in programming comfort/simplicity for the programmer) compared to the Borland "alternatives" (which can hardly be called that).
On Friday, 23 Sep 2005, Florian Klaempfl wrote:
I know Free Pascal very well. It comes with a lot of extensions, is fast and produces small binaries. But when you want to program for different platforms, you'll end up with lots of {$IfDef}s.
This is one of my main criticisms -- putting the burden of making the code portable on the programmer, rather than designing the system itself to be portable (as much as possible).
Huh? That's new to me. We try to design everything to be as portable as possible, even taking care of non-mainstream platforms.
Yes, both compilers *try* to be portable in the end. But they take very different paths to achieve it.
In FPC, system-specific things are often implemented for different systems, and then wrappers are written afterwards to make it portable. In GPC, nothing is implemented until it is portable.
The disadvantage for FPC is that there are a lot of system-specific units in the package, while the disadvantage for GPC is that there are far fewer units in the package.
The other one is the omission of standard Pascal features. (I'm not so much opposed to extensions; IMHO omissions are much worse.) And
from the discussions I've had/seen, this is not accidental but
fully intentional, as there seems to be some kind of hatred of and ignorance about the standards among the developers.
Well, FPC is made by a community which contributes patches, and it doesn't seem that anybody has contributed a patch for an ANSI mode so far
That doesn't mean that an ANSI mode is not wanted. I think that is because most FPC people come from the Windows/Delphi world.
I see a lot of things in Extended Pascal which I really miss in FPC. That's why I'm moving away from FPC towards GPC. The only thing that holds me back from GPC is the trouble on the MinGW platform and the very large binaries it produces. But the language IS better.
which is probably useless anyway as long as the PVS isn't free in the GNU sense.
What is PVS?
Whether the standards are open doesn't really matter for a GNU project. See GCJ or DotGNU. Even POSIX is not freely available. But the standard papers for Pascal and Extended Pascal are available.
Anyway, why should anybody care? If you want a compiler trying to be ISO standard compliant: use GPC; if you want a compiler trying to be Borland compatible: use FPC.
But you must admit that most Borland extensions are very DOS/Windows specific, while the extensions from Extended Pascal are designed to be very portable.
Yet many standard features would seem rather easy to add. A few, such as schemata, are probably not so easy, but those would actually provide some major benefits (in programming comfort/simplicity for the programmer) compared to the Borland "alternatives" (which can hardly be called that).
Andres K. Foerster wrote:
I won't discuss the language parts because it's simply a matter of belief ;) Only a small remark:
Yes, both compilers *try* to be portable in the end. But they take very different paths to achieve it.
In FPC, system-specific things are often implemented for different systems, and then wrappers are written afterwards to make it portable. In GPC, nothing is implemented until it is portable.
The disadvantage for FPC is that there are a lot of system-specific units in the package, while the disadvantage for GPC is that there are far fewer units in the package.
[...]
holds me back from GPC is the trouble on the MinGW platform and the very large binaries it produces.
The big executables are probably the price for GPC's approach: e.g. a CRT unit based on ncurses talking to the terminal or the Win API of course generates bigger executables than a CRT unit talking directly to the terminal or the Windows API.
People usually measure the size of FPC executables against Delphi, and the FPC ones are bigger, simply because the FPC RTL contains some abstraction layers; that's simply the price for making things portable.
Further, we implement a lot of stuff ourselves in FPC, simply because experience has shown that it is e.g. almost impossible to deploy Linux binaries working on all not-too-old Linux platforms if they are linked against glibc.
Deploying only sources isn't a real option for me, because Pascal is also used a lot for teaching, and you can't expect people just learning to program to also compile and install a compiler from the sources.
About the ISO mode: if a feature is important enough for FPC, people pop up to implement it; history has taught that :)
Andres K. Foerster wrote:
which is probably useless anyway as long as the PVS isn't free in the GNU sense.
What is PVS?
Pascal Validation Suite.
Whether the standards are open doesn't really matter for a GNU project.
Indeed. AFAIK, there isn't even a Borland Pascal validation suite (unless perhaps one hidden inside Borland, but I've never heard of it), so is it impossible to aim for BP compatibility? Obviously not, as both projects do so.
And even a non-free test suite can be useful (though less useful than a fully free one, of course). If someone ran the suite through GPC, and only told me which tests fail (and possibly for which reasons), this would already help me. (Scott Moore indicated to me that he is willing to do this after the next GPC release, now that he OCR'd the PVS from the PUG newsletters.) Probably we can then write our own (free) tests for whichever new issues will be found, based on the descriptions. And if we finally get a report that GPC passes a non-free PVS, that's (in effect) almost as good as passing a free PVS.
And of course, a test suite can never cover every aspect of a standard, so the PVS isn't everything anyway. Many users of GPC have found ISO deviations which have meanwhile been fixed, and probably not all of them were covered by the PVS. Some time ago, Artur Zaroda, who develops another Pascal compiler, ran his own tests on GPC and in the process found and reported some 50 ISO bugs to us. And not least, Waldek and I have carefully studied various aspects of the standards and thus found some GPC bugs (and sometimes even "bugs" in the standards).
Actually I think, though the standard's language is not very readable, it's ultimately easier than providing BP compatibility. With BP, we (as well as you) have the manual and the compiler. In cases of doubt, we can read the manual and test the compiler. If they disagree (as they do in a few cases), we can only guess which behaviour is more reasonable (and there's no authoritative statement such as a standard document -- in fact, I think most programmers coming from BP would consider the compiler's behaviour more important than what the manual says, in case of disagreement). And, of course, there are many cases where a behaviour is not specified in the manual (and usually should not be specified IMHO), but BP programmers rely on a particular behaviour that exists under BP (sometimes only as a coincidence of some DOS peculiarities or something). I suppose you know this just as well as I do ...
So in summary, IMHO, the first steps may be easier in BP compatibility, but when it comes to the hard cases, a written standard is better. Of course, a complete, correct and free test suite would be best, but this is something that doesn't exist anywhere.
Anyway, why should anybody care? If you want a compiler trying to be ISO standard compliant: use GPC; if you want a compiler trying to be Borland compatible: use FPC.
But you must admit that most Borland extensions are very DOS/Windows specific, while the extensions from Extended Pascal are designed to be very portable.
And also many are rather low-level, including the stuff they blindly copied from C (#0-terminated strings, 0-based arrays in open arrays, etc.) ...
But again, I'm not so much complaining about extensions. I use some of the better Borland extensions myself in code. But I wouldn't want to live with many of the Borland omissions anymore. Sometimes I use GPC as an ISO compliant compiler, sometimes (especially for my older code that I originally wrote under BP) I use it as a "better BP", which is BP compatible but allows me to remove many kludges I had to do under BP.
Frank
On Friday, 23 Sep 2005, Frank Heckenbach wrote:
to live with many of the Borland omissions anymore. Sometimes I use GPC as an ISO compliant compiler, sometimes (especially for my older code that I originally wrote under BP) I use it as a "better BP", which is BP compatible but allows me to remove many kludges I had to do under BP.
I get the impression that you, Frank Heckenbach and Florian Klaempfl, are talking at cross purposes.
When you, Frank, read about Borland compatibility, you only think about the old Borland Pascal for DOS, while Florian also means compatibility with Delphi.
When you, Florian, read about ANSI compatibility, you only think about the old ANSI Standard Pascal, while Frank also means compatibility with the ANSI Extended Pascal standard.
On Friday, 23 Sep 2005, Frank Heckenbach wrote:
But you must admit that most Borland extensions are very DOS/Windows specific, while the extensions from Extended Pascal are designed to be very portable.
And also many are rather low-level, including the stuff they blindly copied from C (#0-terminated strings, 0-based arrays in open arrays, etc.) ...
Well, but there are also some extensions which I really miss in GPC. And I hope both will get a little more compatible with each other. So, please take it as a wishlist.
- Mainly I like the so-called AnsiString. http://community.freepascal.org:10000/docs-html/ref/refsu10.html
- Namespaces. I mean, in FPC you can define functions with the same name in different units and you can specify which one to use, like this: crt.readkey or graph.readkey
- Classes. http://community.freepascal.org:10000/docs-html/ref/refch6.html
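For illustration, the namespace item could look like this in use (a sketch; it assumes two units in scope that both export a ReadKey function, as in the crt/graph example above):

```pascal
program QualifiedDemo;

uses
  crt, graph;

var
  ch: Char;

begin
  { the qualified form names the unit explicitly, so the call is }
  { unambiguous even if both units export a ReadKey }
  ch := crt.ReadKey;
  WriteLn (ch)
end.
```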
Andres K. Foerster wrote:
Well, but there are also some extensions which I really miss in GPC. And I hope both will get a little more compatible with each other. So, please take it as a wishlist.
- Mainly I like the so-called AnsiString. http://community.freepascal.org:10000/docs-html/ref/refsu10.html
- Namespaces. I mean, in FPC you can define functions with the same name in different units and you can specify which one to use, like this: crt.readkey or graph.readkey
Done. And it is really not Borland specific.
In-progress (search archives for threads about Mac objects and Delphi classes).
Attached is a fairly rigorous test program for Extended Pascal strings. It was written by Tony Heatherington at Prospero Software; Heatherington was a member of the Extended Pascal committee. I have commented out the 3 Prospero-specific tests.
gpc does well on this test, but fails one syntax test and several runtime ones.
The syntax problem is in using a parameter to define a string capacity. That can be temporarily worked around by substituting (14) for (n) on line 65.
Most of the runtime errors are generated by a single conformance problem regarding string equality. The relevant standard text, 6.8.3.5, is below, after the string test program.
Willett Kempton
willett wrote:
Attached is a fairly rigorous test program for Extended Pascal strings. It was written by Tony Heatherington at Prospero Software; Heatherington was a member of the Extended Pascal committee. I have commented out the 3 Prospero-specific tests.
gpc does well on this test, but fails one syntax test and several runtime ones.
The syntax problem is in using a parameter to define a string capacity. That can be temporarily worked around by substituting (14) for (n) on line 65.
More precisely, it's a semantic, not a syntax problem, but indeed it's a GPC bug. GPC doesn't yet support initialized variables of non-constant size. Removing the `value ...' and doing the assignment in code also works around the problem. (Though indeed the error message does not indicate it's about initialization.)
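A sketch of the workaround described here (hypothetical procedure; the string capacity depends on the run-time parameter `n'):

```pascal
program ValueWorkaround (Output);

procedure Demo (n: Integer);
var
  { `var s: String (n) value 'sample';' would hit the GPC bug, }
  { since the initialized variable has non-constant size }
  s: String (n);
begin
  s := 'sample';  { a plain assignment in code works instead }
  WriteLn (s)
end;

begin
  Demo (14)
end.
```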
Most of the runtime errors are generated by a single conformance problem regarding string equality. The relevant standard text, 6.8.3.5, is below, after the string test program.
GPC by default does exact string comparisons, without blank padding. To get the (broken, IMHO) EP behaviour, one can use --extended-pascal (which requires other changes in the test, such as avoiding the use of Word, SetLength etc.), or --no-exact-compare-strings. With this option I get rid of Fail 11 and 12. Then I get a range-check error in this line:
IF x1[j*k..20][1] <> 'k' THEN fail(34);
This is because GPC does not shift the range of string slices to 1..x by default, only in EP mode. Currently we don't have a separate feature option for this, so I tried with --extended-pascal now (removing the non-EP parts). I then get:
actual schema discriminants do not match
in this line:
valueparam(fval,ls2);
Though the text of the message is somewhat misleading, the error (according to strict EP rules) is correct (i.e., the test is not EP compliant), as Length (fval) = 14 and Length (ls2) = 7:
: 6.7.3.2 Value parameters
:
: If the parameter-form of the value-parameter-specification contains a
: schema-name that denotes the schema denoted by the required
: schema-identifier string, then each corresponding actual-parameter contained
: by the activation-point of an activation shall possess a type having an
: underlying-type that is a string-type or the char-type; it shall be an error
: if the values of these underlying-types, associated with the values denoted
: by the actual-parameters, do not all have the same length.
Fix:
PROCEDURE valueparam(fst1: s20; fst2: s20);
Then I get Fail 71. This test assumes a particular default field width in WriteStr which EP does not guarantee. Fix:
writestr(lstring,i:6,r:7:2,ch,lpac);
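In other words, giving every item an explicit field width makes the result independent of any implementation default. A minimal sketch with hypothetical variables:

```pascal
program WidthDemo (Output);

var
  s: String (40);
  i: Integer;
  r: Real;

begin
  i := 42;
  r := 3.5;
  { explicit widths: 6 columns for the Integer, 7 columns with }
  { 2 fraction digits for the Real -- no implementation default used }
  WriteStr (s, i : 6, r : 7 : 2);
  WriteLn (s)
end.
```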
Then it passes all tests.
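The exact-vs-padded comparison mentioned above can be illustrated with a tiny sketch (option name as given earlier):

```pascal
program CompareDemo (Output);

var
  a, b: String (10);

begin
  a := 'abc';
  b := 'abc  ';
  { GPC's default exact comparison: False (different lengths). }
  { Under EP rules, or with --no-exact-compare-strings, the shorter }
  { operand is blank-padded first, so the comparison yields True. }
  WriteLn (a = b)
end.
```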
Thanks for running these tests!
Frank
Frank Heckenbach wrote:
This is because GPC does not shift the range of string slices to 1..x by default, only in EP mode. Currently we don't have a separate feature option for this, so I tried with --extended-pascal now (removing the non-EP parts).
The non-EP part is called `StringExtensions', so we can switch to {$gnu-pascal}
I then get:
actual schema discriminants do not match
in this line:
valueparam(fval,ls2);
Though the text of the message is somewhat misleading, the error (according to strict EP rules) is correct (i.e., the test is not EP compliant), as Length (fval) = 14 and Length (ls2) = 7:
: 6.7.3.2 Value parameters
:
: If the parameter-form of the value-parameter-specification contains a
: schema-name that denotes the schema denoted by the required
: schema-identifier string, then each corresponding actual-parameter contained
: by the activation-point of an activation shall possess a type having an
: underlying-type that is a string-type or the char-type; it shall be an error
: if the values of these underlying-types, associated with the values denoted
: by the actual-parameters, do not all have the same length.
I think that GPC is wrong here. Namely, we have:
TYPE ... s20 = string(20);
so s20 does _not_ denote the `string' schema, it is merely a type produced from that schema with a tuple.
6.7.3.2 applies only for declarations of form:
PROCEDURE valueparam(fst1, fst2: string);
That is clear if you think about intended implementation: for discriminated schema you pass just the data. For undiscriminated schema you pass the data and a descriptor. If multiple parameters have the same type you can pass a single (common) descriptor. IMHO possibility of such optimization is the only justification for 6.7.3.2. Of course, our implementation is quite different...
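The distinction can be sketched as two declarations (the bodies are just illustrative):

```pascal
program ParamDemo (Output);

type
  s20 = String (20);  { a type produced from the `string' schema }

{ discriminated: the capacity is fixed at 20, so only the data need }
{ be passed; the same-length rule of 6.7.3.2 does not apply }
procedure Fixed (a, b: s20);
begin
  WriteLn (a, b)
end;

{ undiscriminated: the schema itself; an implementation may pass one }
{ shared descriptor, so 6.7.3.2 demands equal-length actuals }
procedure Open (a, b: String);
begin
  WriteLn (a, b)
end;

begin
  Fixed ('Hello', 'Pascal');  { lengths need not match here }
  Open ('abcde', 'fghij')     { equal lengths, as 6.7.3.2 requires }
end.
```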
On 13 Nov 2005, at 09:15, Waldek Hebisch wrote:
Frank Heckenbach wrote:
This is because GPC does not shift the range of string slices to 1..x by default, only in EP mode. Currently we don't have a separate feature option for this, so I tried with --extended-pascal now (removing the non-EP parts).
The non-EP part is called `StringExtensions', so we can switch to {$gnu-pascal}
I then get:
actual schema discriminants do not match
in this line:
valueparam(fval,ls2);
Though the text of the message is somewhat misleading, the error (according to strict EP rules) is correct (i.e., the test is not EP compliant), as Length (fval) = 14 and Length (ls2) = 7:
: 6.7.3.2 Value parameters
:
: If the parameter-form of the value-parameter-specification contains a
: schema-name that denotes the schema denoted by the required
: schema-identifier string, then each corresponding actual-parameter contained
: by the activation-point of an activation shall possess a type having an
: underlying-type that is a string-type or the char-type; it shall be an error
: if the values of these underlying-types, associated with the values denoted
: by the actual-parameters, do not all have the same length.
I think that GPC is wrong here. Namely, we have:
TYPE ... s20 = string(20);
so s20 does _not_ denote the `string' schema, it is merely a type produced from that schema with a tuple.
6.7.3.2 applies only for declarations of form:
PROCEDURE valueparam(fst1, fst2: string);
That is clear if you think about intended implementation: for discriminated schema you pass just the data. For undiscriminated schema you pass the data and a descriptor. If multiple parameters have the same type you can pass a single (common) descriptor. IMHO possibility of such optimization is the only justification for 6.7.3.2. Of course, our implementation is quite different...
-- Waldek Hebisch hebisch@math.uni.wroc.pl
Frank, Waldek and others: Below I attach a note from Tony Heatherington, which was actually answering an earlier string implementation question I raised with him, but parts of which pertain to this question.
Tony: I thought you might be interested in this discussion. Ask us to 'cc' if you are interested, or join the gpc list.
Willett
Willett,
Very many apologies; your message about strings arrived while I was away, and because it clearly needed thinking about it was put on one side, and I have just retrieved it. By now it may well be too late to be useful, but let me have a go anyhow.
I think that the Standard is worded, at least in part, in a way that is intended to allow various representations of string values, and methods of parameter passing. When you say:
So, VAR parameter, capacity is same as actual parameter's capacity. Value parameter, capacity is same as length of actual parameter.
you are right, but it can be a bit more complicated. (Oh dear, it is more than ten years since I looked seriously at all this!) One question is what happens to (VAR a,b,c: string) or (a,b,c: string), and the stuff about "it shall be an error" is really concerned with such cases. Do you pass one hidden parameter or three? The Standard does not want to prejudge that decision. And does the representation of string values have a length count at the beginning or a null terminator? In either case there is a hidden parameter for each actual, but with the null terminator getting the capacity is more tedious. As you may remember, for better or worse I adopted a format for strings that puts a null at the end and also a count of unused positions (which coincides with the null when the length is the same as the capacity). It means that the capacity has to be known for some purposes which would not require it with a leading length count, for example, but allows string values to be passed in Windows API calls.
Also, because the actual corresponding to a value formal can be any string expression, the method that the implementation uses for evaluating string expressions and returning values from string functions has to be stirred in as well. And yet again, the domain type of a pointer can also be a schema, for example "string", and it can help to deal with these in a related way. Do you have one-byte or two-byte characters, or both?
Probably by now you have worked all this out, and again I apologise for not responding more promptly. If it would help to expand any more, do let me know.
... All the best,
Tony
--
Prospero Development Software London SW18 1PY Tel: +44 20 8875 9011
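A hypothetical model of the layout Tony describes (not Prospero's actual definitions, just an illustration): the data is null-terminated, and a trailing count of unused positions lets the capacity be recovered.

```pascal
program LayoutDemo (Output);

type
  { sketch of a Prospero-style string of capacity Cap: }
  ProsperoString (Cap: Integer) = record
    Data: packed array [1 .. Cap] of Char;  { e.g. 'abc'#0, rest unused }
    Unused: Integer                         { Cap minus current length }
  end;

var
  s: ProsperoString (8);

begin
  s.Unused := 5;  { e.g. holding 'abc': capacity 8, length 3 }
  WriteLn ('Length = ', s.Cap - s.Unused)
end.
```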
Waldek Hebisch wrote:
I then get:
actual schema discriminants do not match
in this line:
valueparam(fval,ls2);
Though the text of the message is somewhat misleading, the error (according to strict EP rules) is correct (i.e., the test is not EP compliant), as Length (fval) = 14 and Length (ls2) = 7:
: 6.7.3.2 Value parameters
:
: If the parameter-form of the value-parameter-specification contains a
: schema-name that denotes the schema denoted by the required
: schema-identifier string, then each corresponding actual-parameter contained
: by the activation-point of an activation shall possess a type having an
: underlying-type that is a string-type or the char-type; it shall be an error
: if the values of these underlying-types, associated with the values denoted
: by the actual-parameters, do not all have the same length.
I think that GPC is wrong here. Namely, we have:
TYPE ... s20 = string(20);
so s20 does _not_ denote the `string' schema, it is merely a type produced from that schema with a tuple.
6.7.3.2 applies only for declarations of form:
PROCEDURE valueparam(fst1, fst2: string);
That is clear if you think about intended implementation: for discriminated schema you pass just the data. For undiscriminated schema you pass the data and a descriptor. If multiple parameters have the same type you can pass a single (common) descriptor. IMHO possibility of such optimization is the only justification for 6.7.3.2. Of course, our implementation is quite different...
Oh yes, I had misread this case (that's why GPC's error message appeared strange to me; I should have looked closer ...)
This patch fixes this bug, and also a similar one with string var-parameters, if I understand this case right now (the statement `p (w^)' in fjf1098b.pas is correct, isn't it?).
{$extended-pascal}
program fjf1098a (Output);
type s = String (20);
procedure p (a, b: s); begin WriteLn (a, b) end;
begin p ('OK', '') end.
{$extended-pascal}
program fjf1098b (Output);
type s = String (20);
procedure p (var a: s); begin Write (a) end;
var v: ^s; w: ^String;
begin New (v); v^ := 'O'; p (v^); New (w, 20); w^ := 'K'; p (w^); WriteLn end.
Frank
Frank Heckenbach wrote:
This patch fixes this bug, and also a similar one with string var-parameters, if I understand this case right now (the statement `p (w^)' in fjf1098b.pas is correct, isn't it?).
Yes, I think fjf1098b.pas is correct. Thanks.
(from prior thread: Re: test program for Extended Pascal strings)
On 13 Nov 2005, at 08:40, Frank Heckenbach wrote:
Then I get a range-check error in this line:
IF x1[j*k..20][1] <> 'k' THEN fail(34);
This is because GPC does not shift the range of string slices to 1..x by default, only in EP mode. Currently we don't have a separate feature option for this, so I tried with --extended-pascal now (removing the non-EP parts). I then get:
Rather than adding a separate feature option, why not make shifting the range of string slices the default behavior? Aren't string slices a feature available only in Extended Pascal implementations, anyway?
Willett Kempton
willett wrote:
(from prior thread: Re: test program for Extended Pascal strings)
On 13 Nov 2005, at 08:40, Frank Heckenbach wrote:
Then I get a range-check error in this line:
IF x1[j*k..20][1] <> 'k' THEN fail(34);
This is because GPC does not shift the range of string slices to 1..x by default, only in EP mode. Currently we don't have a separate feature option for this, so I tried with --extended-pascal now (removing the non-EP parts). I then get:
Rather than adding a separate feature option, why not make shifting the range of string slices the default behavior? Aren't string slices a feature available only in Extended Pascal implementations, anyway?
GPC also supports string slices in its dialect, and also slices of other arrays. The latter is why we don't shift -- for general arrays there's no natural lower bound, so keeping the original range seems the only reasonable thing to do.(*) Then, if we do so for general arrays, it seems consistent to do so for strings, as otherwise we'd get strange behaviour:
var a: packed array [1 .. 10] of Char; b: packed array [0 .. 10] of Char;
a[5 .. 7] -> range 1 .. 3 according to EP
b[5 .. 7] -> range 5 .. 7 as it's no string
(According to strict EP rules, even the lack of `packed', which makes no implementation difference for Char-arrays in GPC, would have such an effect.)
(*) Perhaps one could think about always shifting to the lower bound of the actual array, which would be EP compatible in the case of strings. But it would also have strange effects: adding an element to the lower side of an array (say, changing from 1.. to 0..) would affect the result of slices far apart, both in terms of array elements and code position.
One might argue for abandoning non-string slice accesses altogether. So far, in most cases where they seemed useful, they actually weren't, e.g. due to strict type-checking; and to actually make them useful, we'd need to do more work, such as special type-compatibility rules, as have recently been discussed.
OTOH, array slices may be the only syntactically clean way (so far) to copy or fill parts of an array (instead of Move and FillChar), so there may be merit in having them (with special type-checking rules) ...
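For instance, part-of-array copying with slices could look like this (a sketch only; as discussed, GPC's current type-checking may not accept it yet, and the exact rules are the open design question):

```pascal
program SliceCopy (Output);

var
  a, b: array [1 .. 10] of Integer;
  i: Integer;

begin
  for i := 1 to 10 do
  begin
    a[i] := i;
    b[i] := 0
  end;
  { copy elements 3..5 of a over elements 3..5 of b, }
  { with no resort to Move/FillChar }
  b[3 .. 5] := a[3 .. 5];
  WriteLn (b[4])
end.
```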
Frank
On 14 Nov 2005, at 13:12, Frank Heckenbach wrote:
willett wrote:
(from prior thread: Re: test program for Extended Pascal strings)
On 13 Nov 2005, at 08:40, Frank Heckenbach wrote:
Then I get a range-check error in this line:
IF x1[j*k..20][1] <> 'k' THEN fail(34);
This is because GPC does not shift the range of string slices to 1..x by default, only in EP mode. Currently we don't have a separate feature option for this, so I tried with --extended-pascal now (removing the non-EP parts). I then get:
Rather than adding a separate feature option, why not make shifting the range of string slices the default behavior? Aren't string slices a feature available only in Extended Pascal implementations, anyway?
GPC also supports string slices in its dialect, and also slices of other arrays. The latter is why we don't shift -- for general arrays there's no natural lower bound, so keeping the original range seems the only reasonable thing to do.(*) Then, if we do so for general arrays, it seems consistent to do so for strings, as otherwise we'd get strange behaviour:
var a: packed array [1 .. 10] of Char; b: packed array [0 .. 10] of Char;
a[5 .. 7] -> range 1 .. 3 according to EP
b[5 .. 7] -> range 5 .. 7 as it's no string
(According to strict EP rules, even the lack of `packed', which makes no implementation difference for Char-arrays in GPC, would have such an effect.)
(*) Perhaps one could think about always shifting to the lower bound of the actual array, which would be EP compatible in the case of strings. But it would also have strange effects: adding an element to the lower side of an array (say, changing from 1.. to 0..) would affect the result of slices far apart, both in terms of array elements and code position.
One might argue for abandoning non-string slice accesses altogether. So far, in most cases where they seemed useful, they actually weren't, e.g. due to strict type-checking; and to actually make them useful, we'd need to do more work, such as special type-compatibility rules, as have recently been discussed.
OTOH, array slices may be the only syntactically clean way (so far) to copy or fill parts of an array (instead of Move and FillChar), so there may be merit in having them (with special type-checking rules) ...
Frank
Much more complicated than I would have expected. I can see why GPC works the way it does now. My concern is that a runtime incompatibility with a language standard makes porting code more error-prone and leads to unexpected bugs that may be discovered only late in the development process ("late" as in, when users already have your product). Thus, IMHO, a high-priority factor to weigh among those above is that array slice bounds not differ among implementations, and not differ when the --extended-pascal option is turned on and off.
Thoughts on options:
1. Having the array slice always shift to the lower bound, as you suggest, is not unreasonable. It is comparable to having an array slice be the "base type" of the expression. (My coding intuition, by the way, is that if I change the lower bound of an array, I'm going to have to look carefully at all my code accessing that array anyway.)
2. Having the array slice shift only for true strings appears less desirable, from my limited perspective. Still, it might be preferable to the current situation because it does not introduce runtime incompatibilities across systems. If the confusion you note is a problem, perhaps a warning could be issued for arrays of char that are not strings.
3. You make a strong case for retaining non-string slice accesses.
Hope this is helpful in thinking through what the best design is.
Willett Kempton Visible Software
On 23 Sep 2005 at 17:45, Florian Klaempfl wrote:
[...]
I know Free Pascal very well. It comes with a lot of extensions, is fast and produces small binaries. But when you want to program for different platforms, you'll end up with lots of {$IfDef}s.
This is one of my main criticisms -- putting the burden of making the code portable on the programmer, rather than designing the system itself to be portable (as much as possible).
Huh? That's new to me. We try to design everything to be as portable as possible, even taking care of non-mainstream platforms.
I think what he might have been referring to is the fact that, with FPC, different platforms seem to have their own version of many units (in platform-specific sub-directories), whereas, with GPC, there is only one version of each of the standard units, and that one version is fully portable.
The other one is the omission of standard Pascal features. (I'm not so much opposed to extensions; IMHO omissions are much worse.) And
from the discussions I've had/seen, this is not accidental but
fully intentional, as there seems to be some kind of hatred of and ignorance about the standards among the developers.
Well, FPC is made by a community which contributes patches, and it doesn't seem that anybody has contributed a patch for an ANSI mode so far -- which is probably useless anyway as long as the PVS isn't free in the GNU sense.
It's not useless for someone who wants/needs a compiler that is standards-compliant. Such a person might not have the expertise to patch the compiler - and those who have the expertise to do so (e.g., the development team) might not be interested in doing so. Therefore, the person who needs a standards-compliant compiler has to look elsewhere.
Anyway, why should anybody care? If you want a compiler trying to be ISO standard compliant: use GPC; if you want a compiler trying to be Borland compatible: use FPC.
Why not have both - standards and Borland compatibility? It is one thing if the FPC developers do not have the ability to add standards compliance. It is another thing if adding it would be very hard to achieve. But if the ability is there, and it is not difficult to do, then why not do it? I don't think it would harm FPC at all if it had an ISO compatibility mode - even if only turned on by a switch, or even if only by supplying all the Borland omissions. That way, people can choose between GPC and FPC on the basis of things such as the quality of the generated code, rather than first deciding whether they want Borland or ISO compatibility and then getting stuck by default with one or the other (just my two pence ...)
Best regards, The Chief -------- Prof. Abimbola A. Olowofoyeku (The African Chief) web: http://www.greatchief.plus.com/