On Fri, Dec 13, 2002 at 11:26:54AM -0500, CBFalconer wrote:
> Emil Jerabek wrote:
> > On Wed, Dec 11, 2002 at 02:47:31PM +0100, Emil Jerabek wrote:
> > [...]
> > Exactly the same thing happens also with some other variants of the
> > schema Foo definition:

Sorry, this should be "Bar", not "Foo", of course. I got confused myself
by those stupid names ...

> >   Bar (U: Cardinal) = array [0 .. U] of Char;
> >   Bar (L: Integer; U: Cardinal) = array [L .. U] of Char;
>                                            ^^^^^^
>
> These are different types, thus not a legal specification. Or am I
> missing the point? (which may be that it causes a crash).

This is arguable: different _integer_ types are compatible in many
situations where type identity is otherwise required. Cardinal is
nonstandard, and the exact rules for mixing Integer and Cardinal are not
stated anywhere in the GPC docs. But I understand there is a good reason
not to allow the example above (another possibility would be to promote
L and U to LongInt).
Anyway, the point was that the compiler should _never_ crash, even if given an invalid input.
Finally, a technical issue: I assume your reply was intended for the list. If it wasn't, I apologize for making it public. If it was, please make sure that you use a "List Reply" or "Reply All" or something similar, when replying to the list. (Or is it just a bug in your MUA?)
Emil
> --
> Chuck F (cbfalconer@yahoo.com) (cbfalconer@worldnet.att.net)
>   Available for consulting/temporary embedded and systems.
>   http://cbfalconer.home.att.net  USE worldnet address!
Emil Jerabek wrote:
> On Fri, Dec 13, 2002 at 11:26:54AM -0500, CBFalconer wrote:
> > > On Wed, Dec 11, 2002 at 02:47:31PM +0100, Emil Jerabek wrote:
> > >   Bar (L: Integer; U: Cardinal) = array [L .. U] of Char;
> >                                            ^^^^^^
> >
> > These are different types, thus not a legal specification. Or am I
> > missing the point? (which may be that it causes a crash).
>
> This is arguable: different _integer_ types are compatible in many
> situations where type identity is otherwise required. Cardinal is
> nonstandard, and the exact rules for mixing Integer and Cardinal are
> not stated anywhere in the GPC docs.

I tend to consider them as subranges of a fictitious largest integer
type. (Fictitious, since there's no type that contains both LongestInt
and LongestCard. In fact this is often a real problem in such
situations, so I can somewhat understand Borland for providing (at least
in BP) only a signed type of the largest size. OTOH, the largest
unsigned type is sometimes needed, so I guess we have to accept the
trouble ...)
> But I understand there is a good reason not to allow the example above
> (another possibility would be to promote L and U to LongInt).

I'm doing the latter now to fix the problem.
> Anyway, the point was that the compiler should _never_ crash, even if
> given an invalid input.

That's true in any case.
Frank