Mirsad Todorovac wrote:
... snip ...
However, SPARSE SETs seem to me like an excellent challenge for a programmer, but they *do* introduce new issues, for example memory allocation problems - since their size can't be computed at compile time, as it depends on the actual number of set members; and so on and so on ...
Frank, what do you think of these thoughts of mine? Anybody?
I am a crusty old conservative when it comes to Pascal. I don't think it should be extended to do everything.
This is an implementation detail, so not an extension. I agree, however, that the features suggested (notably sparse sets) are somewhat over the top.
I don't even believe in sets of over 256 items. The code generation should be kept simple and accurate.
I do. The problem is that otherwise code that relies on a set of an enumeration type doesn't scale well. Tokenizers, parsers, and symbol tables often use this.
The simple set type should become dynamic beyond a certain size (support can probably be built on top of schemata for Extended Pascal compilers, and on dynamic arrays for e.g. FPC).
Set handling will then get slower if it goes beyond 256 elements, and writing sets to files in binary form will be different, but at least you don't have to redesign your working program.
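To make the scaling concern concrete, here is a minimal sketch of the usual idiom (the token names and sets are invented for illustration, not taken from any real compiler): a recursive-descent parser keeping "start" sets as a set of an enumeration type. Once such an enumeration grows past 256 values, this stops compiling on compilers with the 256-element limit, and the working program has to be reworked.

  program TokenSetDemo;

  type
    { a small token enumeration; real tokenizers easily reach hundreds of values }
    TToken = (tkIdent, tkNumber, tkPlus, tkMinus, tkStar, tkSlash,
              tkLParen, tkRParen, tkSemicolon, tkEOF);
    TTokenSet = set of TToken;

  const
    { a typical "start" set as used in a recursive-descent parser }
    ExprStart: TTokenSet = [tkIdent, tkNumber, tkLParen];

  var
    tok: TToken;
  begin
    tok := tkNumber;
    if tok in ExprStart then
      writeln('token can start an expression')
    else
      writeln('token cannot start an expression')
  end.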
If you want a sparse set, you can build it yourself, probably out of linked lists of subsets. If you really want large sets, an array of subsets is quite adequate.
Agree.
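For what it's worth, here is a rough sketch of the array-of-subsets idea, built only from standard constructs (the names, chunk size and range are my own choices for illustration, not a proposal for the language):

  program BigSetDemo;

  const
    ChunkSize = 256;                    { elements in one ordinary Pascal set }
    MaxElem   = 1023;                   { largest value we want to store }
    NumChunks = (MaxElem + 1) div ChunkSize;

  type
    TChunk  = set of 0..ChunkSize - 1;  { a plain 256-element set }
    TBigSet = array[0..NumChunks - 1] of TChunk;

  procedure AddElem(var s: TBigSet; n: integer);
  begin
    s[n div ChunkSize] := s[n div ChunkSize] + [n mod ChunkSize]
  end;

  function HasElem(const s: TBigSet; n: integer): boolean;
  begin
    HasElem := (n mod ChunkSize) in s[n div ChunkSize]
  end;

  var
    big: TBigSet;
    i:   integer;
  begin
    for i := 0 to NumChunks - 1 do
      big[i] := [];                     { start out empty }
    AddElem(big, 5);
    AddElem(big, 700);
    writeln(HasElem(big, 700));         { TRUE }
    writeln(HasElem(big, 701))          { FALSE }
  end.

Membership tests and insertions stay cheap; the price is that the built-in set operators (+, *, -, in on the whole thing) have to be wrapped by hand.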
These are NOT items you use every day. However, SET OF ['0'..'9'] is.
I agree too, but as said, I would like to be able to go beyond 256 elements; otherwise, for some applications one would have to avoid sets, because they don't scale beyond 256 chars.
Also, character sets larger than 256 characters are rapidly becoming more important, so even for chars the 256-element limit could get problematic.
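To put both points side by side: the everyday case fits comfortably in today's sets, while anything resembling a set over a wide character type does not. (The 256-element cap is how the Borland-style compilers behave; as far as I know GPC already allows larger sets, so take the comment below as describing the limited case only.)

  program CharSetDemo;

  const
    Digits = ['0'..'9'];    { the everyday case: fits easily within 256 elements }

  var
    c: char;
  begin
    c := '7';
    if c in Digits then
      writeln(c, ' is a digit');

    { By contrast, a declaration along the lines of
        type TWideSet = set of widechar;
      is rejected by compilers that cap sets at 256 elements,
      which is exactly the scaling problem described above. }
  end.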