Hi everybody,
since GPC development has mostly stalled, I've thought about if and how its development could be continued. Here are my current thoughts about it. Since the text is quite long, I put it on the web: http://fjf.gnu.de/gpc-future.html
As I write there, I don't see much future in the current way of developing GPC, but alternative development models, too, will not be a task for a single person. In other words, without some new contributors, there is probably no chance for GPC development to continue.
I don't really know how many of you currently use GPC, and to what extent and in which ways, e.g., do you use it just to maintain some legacy code, or are you actively writing new applications?
So in order to tell whether continuing GPC development is worthwhile, I'd like to know who of you would actually care about major new features in GPC (as opposed to just preserving the status quo), and who would be interested not only in using GPC, but also supporting its continued development, either by actively contributing to it, or -- perhaps in the case of companies that use GPC -- by paying for continued development.
If it turns out there is not sufficient interest, I'm afraid to say it's probably better to put an end to it now rather than dragging it out further. (Of course, the existing GPC versions will continue to be available, and anyone who wants to can use and modify them, which the GPL already guarantees, but without prospects for the future, I would then retire from GPC development and start to rewrite my own Pascal programs in other languages.)
Frank
Hi!
On 27.07.2010 00:16, Frank Heckenbach wrote:
[...] I don't see much future in the current way of developing GPC [...]
Same here. A language which might not be up to date regarding its features might still be used if there is some infrastructure for common tasks, e.g. processing XML, using some high-level GUIs and much more. As it seems, there is no one who would like to do this job besides the development of GPC itself. Other language communities have all of this, the new ones like "D" and the old ones like "Ada".
For me personally, I don't feel the need for a p2c++ translator as this becomes "just another Pascal dialect".
Eike
Dear Frank,
I use GPC for all my compiled code, and have no plans to stop using it. I far prefer Pascal to C. I will do whatever I can to keep GPC alive. I am a big fan of the schema types of GPC, as compared to the implementation in Borland or FPC.
I'm not an expert at C programming, so if the compiler is written in C, I'm not going to be much good with it. But if the compiler is written in Pascal, and compiles itself, I will be very happy to start trying to host the development of GPC myself, using some sort of version control system like Git. I usually work on MacOS, so I could cover the interface with the MacOS operating system. We would need other people to work on the interfaces with the other operating systems.
I could dedicate 2 hours a week to maintenance of GPC, plus I have static IP addresses for hosting the code. At the very least, I could maintain the existing versions.
Yours, Kevan
Hello,
One suggestion: Why not implement a syntax mode in Free Pascal which is compatible with the current GPC syntax?
That would merge both projects and avoid such trouble.
If there is no volunteer to do this work, but someone interested in its completion has some money (at least 200 EUR for doing the initial bulk of the work, more later for the fine-tuning), I could try to find an implementor for it in the context of the Lazarus Season of Code: http://wiki.freepascal.org/Lazarus_Season_of_Code
bye,
On Mon, 26 Jul 2010, Frank Heckenbach wrote:
Hi everybody,
since GPC development has mostly stalled, I've thought about if and how its development could be continued. Here are my current thoughts about it. Since the text is quite long, I put it on the web: http://fjf.gnu.de/gpc-future.html
[...]
I know from experience that Frank's return address bounces what's sent to it, and I don't see any other address to respond to posted, so I guess I have to do it here.
I don't use Pascal in my professional work (sorry to say), but it is my preferred language for personal projects, and despite my former Borland cultism, I have great appreciation for the Extended Pascal standard, which makes GPC my preferred Pascal compiler. I find it interesting that when I was a young CS major back in the 1980s, I somewhat resented Pascal's status as the "official religion" in my university's CS department, but as a professional programmer, my appreciation for the way it encourages good programming practices has increased by leaps and bounds. At the same time, my attitude toward C has gone from infatuation to active dislike. For all of these reasons, I'd hate to see GPC development cease. I could use FPC, but I definitely like GPC better.
That said, I agree in principle with the proposal to rewrite GPC in Pascal, though if it can be made easy to keep the existing GPC in sync with the GCC back end, it would free up time for the core developers to work on the rewrite. I'm not sure how much language extension is required, but Pascal greatly suffers from a lack of bindings to widely used libraries, including and especially GUI toolkits. I know there is a tool to facilitate creation of Pascal bindings to C/C++ libraries, but it's not (as far as I can see) distributed with GPC, or mentioned on the GPC website. I don't really understand why the existing GPC couldn't be used to bootstrap a new compiler written entirely in Pascal, but perhaps Frank could elaborate on why his suggested C++ converter is preferable as an interim step.
I don't have any experience in compiler development, know almost nothing about GCC internals, and have very little free time, so I don't know how useful I could be to the project, but if an updated list of things that need doing could be published, it would likely be easier for me and others like me to get involved.
Again, I greatly appreciate the work that the GPC developers have done over the years and would hate to see it stop. Thanks to Frank for sharing his ideas.
--------------------------
John L. Ries
Salford Systems
Phone: (619)543-8880 x107
   or (435)867-8885
--------------------------
On Tue, 2010-07-27 at 00:16 +0200, Frank Heckenbach wrote:
Hi everybody,
since GPC development has mostly stalled, I've thought about if and how its development could be continued. Here are my current thoughts about it. Since the text is quite long, I put it on the web: http://fjf.gnu.de/gpc-future.html
[...]
I still use GPC regularly when I need portable code for command-line programs (Windows and Linux). I would use it in preference to C for my embedded programming (firmware stuff for Texas Instruments AR7 ADSL modem/routers) if not for the huge executable sizes.
I am not that great with low level or complex compiler stuff, but I would be willing to help with the development in whatever way I can (as long as I am not in over my head).
My one concern would be about producing yet another Pascal object model that is not compatible with any of the existing ones. But I guess that, if the old BP model is still supported so that existing code would compile, that would be fine.
Hi Frank,
On Tue, 2010-07-27 at 00:16 +0200, Frank Heckenbach wrote:
Hi everybody,
since GPC development has mostly stalled, I've thought about if and how its development could be continued. Here are my current thoughts about it. Since the text is quite long, I put it on the web: http://fjf.gnu.de/gpc-future.html
[...]
I'm one of those who uses GPC for legacy code. Important legacy code, since it's the major code we use in our modelling and data analysis, which I started back in the 80s as a postdoc.
So I am eternally grateful that GPC saved me the trouble of rewriting the code when our main platforms became Linux, after using CDC, Prime, AIX, Sun and DEC machines (and perhaps some others that I've forgotten...)
My code is mostly generic Pascal, but I do interface with the Lapack libraries and some of the C file-handling functions. In fact, I mostly use an ancient Linux version of the compiler, since I have no need for developments of the language itself.
The translation idea sounds like a robust way of maintaining the Pascal capability and it offers the interesting possibility of running code on systems that only have C++ compilers installed by shipping the translated code rather than trying to install a Pascal compiler. If one wants to run on a supercomputer, for example, one may have limited compiler options. Even for generic Intel machines, one might want to recompile code optimised for the particular platform.
Mike
This email may be confidential and subject to legal privilege, it may not reflect the views of the University of Canterbury, and it is not guaranteed to be virus free. If you are not an intended recipient, please notify the sender immediately and erase all copies of the message and any attachments.
Please refer to http://www.canterbury.ac.nz/emaildisclaimer for more information.
Mike Reid wrote:
I'm one of those who uses GPC for legacy code. Important legacy code, since it's the major code we use in our modelling and data analysis, which I started back in the 80s as a postdoc.
So I am eternally grateful that GPC saved me the trouble of rewriting the code when our main platforms became Linux, after using CDC, Prime, AIX, Sun and DEC machines (and perhaps some others that I've forgotten...)
:-)
My code is mostly generic Pascal, but I do interface with the Lapack libraries and some of the C file-handling functions. In fact, I mostly use an ancient Linux version of the compiler, since I have no need for developments of the language itself.
Of course, these ancient versions continue to be available, though it will probably get harder to use them over time with each upgrade.
FWIW, I recently set up a Debian squeeze testing system for other projects and also tried the GPC package in it (based on gcc-4.1). After I found that it doesn't work satisfactorily for my purposes (see my article), I just installed the package from etch (2 major releases earlier), the last one based on gcc-3. It installed without any problems or dependency conflicts and works well, as far as I've tested. So at least for the lifetime of squeeze (another few years), there will be a working GPC easily available under Debian Linux, and if GPC development doesn't continue, I will use this time to port my programs without hurry.
Rugxulo wrote:
GPC's strengths are its multi-platform, cross-compilable base with good optimizations and support for lots of dialects (and good docs and examples, IMHO). The downside is few developers (so?),
... so development stalls. :-(
relying on semi-outdated GCC versions (so?),
... so much effort is required which there's no one there to do. :-(
somewhat hard to bootstrap on non-*nix (so?),
Maybe not so big a problem, since The Chief, Maurice and Adriaan have provided binaries for the main three non-Unix platforms.
big runtimes and thus big .EXEs (so?),
Not a problem for most cases, sometimes a problem for embedded systems etc. and for people with a 1980s mindset ("oh shock, a few megabytes executable"). ;-)
lacking the latest / greatest from Delphi (so?),
Ask some Delphi developers. ;-) For me, the lack of other features is more pressing, as I wrote.
and lacking a public CVS/SVN repo for current sources (not needed by me, but ...).
We once had one; it didn't change much in our development, except causing extra work for us to maintain it. It was said back then how important it was, but the simple truth is that any infrastructure problems (including the build process, which takes some steps to set up, but that's it) are minor compared to the actual effort of getting fluent with GPC's internals. So none of the infrastructure initiatives did anything to get us more developers (for the actual compiler).
John Gordon Ollason wrote:
Without a thriving population of young programmers the language will die, and I suspect that the formal virtues of a language in the ALGOL tradition are not being transmitted, and perhaps new programmers are not being equipped to recognize the admirable qualities of Pascal.
I don't want to get too philosophical, but I think a reason for its decline was that in several ways it just was too strict. E.g., while I dislike "goto" as much as Dijkstra did, I'm not so much opposed to "Exit" (which some dismiss as a disguised goto, and which ISO Pascal doesn't have). Quite often, a routine needs to return early based on some initial checks. The alternative, wrapping the whole body in an if-block, though syntactically cleaner, just doesn't help readability: neither when tracing the "Exit" case (you have to look over the whole if-block to find out that it extends until the end of the routine), nor the normal case (you have to look over the whole if-block to make sure no statements appear after it that would be executed in both cases), and last (and least) of all, it adds more indentation levels.
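To illustrate with a minimal sketch (the procedure and its check are hypothetical; "Exit" here is the BP-style extension GPC accepts, not ISO Pascal):

```pascal
program ExitDemo;

{ Early return with the (non-ISO) Exit extension:
  the invalid case is dismissed up front. }
procedure Process (x: Integer);
begin
  if x < 0 then
    Exit;
  WriteLn ('Processing ', x)
end;

{ The ISO-conformant alternative: the whole body is wrapped
  in an if-block, adding an indentation level and forcing the
  reader to scan to its end to confirm nothing follows it. }
procedure ProcessISO (x: Integer);
begin
  if x >= 0 then
    WriteLn ('Processing ', x)
end;

begin
  Process (-1);   { returns early, prints nothing }
  Process (42);
  ProcessISO (42)
end.
```

Both procedures behave identically; the difference is only in how easily a reader can verify that.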
The same goes for the I/O system in ISO Pascal, which is so simplistic that it might be suitable for simple (I hate to say, academic) cases, but anything that requires more control is outside of its scope, thus inviting diverging extensions. (In fact, it might have been better to leave I/O completely out of the language, like C did, and let library authors develop it -- except, of course, that the original language didn't even have libraries or modules in the first place, a big shortcoming in itself.)
Same with the lack of an official, and therefore standardized, way to interface with foreign-language libraries, a common necessity in real-world programs. (Sure, you can reinvent every wheel, i.e. reimplement every library in Pascal, but that's not productive.)
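As a concrete illustration of the consequence, here is a sketch of GPC's own (dialect-specific, non-standard) way of binding a C function -- every compiler had to invent its own such syntax. The Pascal name CSqrt is an arbitrary choice for this example; only the linker name after "name" is fixed:

```pascal
program CBinding;

{ Bind the C math library's sqrt under a Pascal name.
  GPC's Real corresponds to C's double on typical platforms. }
function CSqrt (x: Real): Real; external name 'sqrt';

begin
  WriteLn (CSqrt (2.0) : 0 : 6)
end.
```

(Built e.g. with "gpc cbinding.pas -lm". FPC, Delphi and others each use a different external-declaration syntax, which is exactly the standardization gap described here.)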
So that's perhaps why BP was popular under Dos: it was one fixed dialect (so diverging extensions, though massively present, and their long-term consequences were not known to the majority of programmers), its I/O was extended to be at least suitable for Dos (though it maps less well to other systems, which were ignored by most Dos programmers), and it supported modules (units), which allowed some other needed facilities (such as CRT) to be supplied. But all of this was too short-sighted: We now have a mess of dialects; Dos-style I/O is too limited on modern systems; even CRT (one of the least badly designed BP units IMHO) wasn't as lasting as its rough C counterpart, curses. So it's no surprise that BP's popularity sharply declines with Dos's. And under Windows -- I'm not an expert, but ISTM that Delphi's decline is largely due to MS pushing their own languages, which is, of course, always a natural risk for anyone targeting Windows exclusively.
I suggest, therefore, that no substantial development of GPC would be cost effective unless there were a large base of users of the language, and that the first priority should be to determine the number of current users of the language and to estimate its expected future use. My own pessimistic assessment may be mistaken; I hope it is, but it would make sense to find out before committing a substantial amount of effort in further development of a language that is only used by a small minority of programmers.
I agree with you here. Indeed, one of the purposes of my posting was to determine the number of current users of the language and to estimate its expected future use. But I'm also still pessimistic that a rewrite is really worthwhile, also because, as far as I've gathered so far, the current GPC users seem to be rather diverse (WRT dialects, platforms and features used etc.), which also doesn't bode well for a new project.
Don't get me wrong: I don't regret the time I spent with GPC, even if it was to die now. It was an interesting experience, learning a lot not only about the various Pascal dialects, but also about compiler construction in general (which has benefitted me in other projects since then), and since I did professional work with it, it was even paid. :-) And while it was at it, it was only natural to write my hobby projects in it as well. But as with many things, their time comes and it goes, and it may be its time has gone now and maintaining a compiler only for my hobby projects is just not efficient.
Frank
Hi, (eek, this got longer than I wanted)
On 7/30/10, Frank Heckenbach ih8mj@fjf.gnu.de wrote:
Rugxulo wrote:
relying on semi-outdated GCC versions (so?),
... so much effort is required which there's noone there to do. :-(
In other words, bugs that you yourself can't fix? I don't have any huge complaints with GCC 3.4.4, that's why I'm wondering why it deathly matters to have latest and greatest GCC support (besides the obligatory "it would be nice", e.g. Atom support or plugins in 4.5.0).
big runtimes and thus big .EXEs (so?),
Not a problem for most cases, sometimes a problem for embedded systems etc. and for people with a 1980s mindset ("oh shock, a few megabytes executable"). ;-)
It's more an issue when you write a very simple tool and it takes 300 KiB. It just feels wrong, esp. from an "assembly" mindset or if you're spoiled by smartlinkers in other Pascal compilers. I'll admit, Frank, to be frank, ;-) in real life it doesn't matter in 99% of cases, but I do think libgpc.a could be modularized a bit better. (No, I haven't looked closely yet, my bad! And BTW, I strongly suspect GNU ld's --gc-sections doesn't work with COFF, blech.) I mean, no offense, but when a combined DOS 8086 (TP55) + Win32 .EXE (VP21) takes 24 KiB uncompressed .... ;-)
I don't want to get too philosophical, but I think a reason for its decline was that in several ways it just was too strict. E.g., while I dislike "goto" as much as Dijkstra did, I'm not so much opposed to "Exit" (which some dismiss as a disguised goto, and ISO Pascal doesn't have).
"Exit" as in "break" out of loop? I don't even think TP had it until v6 or (more likely) v7. Also note that some languages (Oberon, Java) don't support goto at all !! And FPC only handles local gotos.
(In fact, it might have been better to leave I/O completely out of the language, like C did, and let library authors develop it
Didn't Modula-2 do that, much to many people's chagrin??
Same with the lack of an official, and therefore standardized way, to interface with foreign-language libraries, a common necessity in real-world programs. (Sure, you can reinvent every wheel, i.e. reimplement every library in Pascal, but that's not productive.)
Ada has Interface (or whatever it's called). Doesn't matter anyways as there are so many competing formats (ELF, COFF, Mach-o) and linkers that work in varying degrees and different ABIs. (Agner's ObjConv potentially helps here, but I've never heavily used it.)
So that's perhaps why BP was popular under Dos, because it was one fixed dialect (so diverging extensions, though massively present, and their long-time consequences, were not known to the majority of programmers),
Except it also extended itself several times! So code that works for TP55 (objects, units) won't work in TP3, nor code in TP6 (inline asm). Plus bugs and heavy 16-bitisms. Doesn't mean lots of good stuff wasn't written in TP/BP (e.g. Chasm: The Rift), but most of that old code is pretty unmaintainable without the exact same compiler version (ahem, TPU incompatibilities).
I/O was extended to be at least suitable for Dos (though it maps less well to other systems which were ignored by most Dos programmers)
Since TP didn't run on anything else (CP/M dropped after v3 and only two TPW releases), that's no surprise, esp. since they never fully supported ISO 7185 or 10206. It's hard to be portable when you ignore standards. And yet most compilers nowadays (even on "modern" OSes, heh) try to emulate BP-style, oddly enough.
and supported modules (units) which allowed some other needed facilities (such as CRT) to be supplied. But all of this was too short-sighted:
Everything in computers is short-sighted. We're constantly being bit by it.
We now have a mess of dialects;
No worse than all those silly Romance languages. ;-)
Dos-style I/O is too limited on modern systems; even CRT (one of the least bad designed BP units IMHO) wasn't as lasting as its C roughly-counterpart curses).
C didn't have curses built-in anyways. In fact, it left a lot out, hence POSIX (which I guess has its own dialects, e.g. 2008). Nobody bothers with pure ANSI C anymore (sadly), which is more painful when using non-GCC compilers (like OpenWatcom).
So it's no surprise that BP's popularity sharply declines with Dos's.
Don't forget that they "dropped" DOS support! BP7 was the last one (1992?). That's the whole (initial) reason for FPC's existence! Even Delphi 1 (Win3x, 1995) was the only 16-bit Windows version. I don't know if their latest (14?) is 64-bit capable yet.
And under Windows, I'm not an expert, but ISTM that Delphi's decline is largely due to MS's pushing their own languages -- which is, of course, always a natural risk for anyone targeting Windows exclusively).
Borland spun off CodeGear a few years ago, and that was later bought by (current owner) Embarcadero (who, uncoincidentally, killed the "free" Turbo Explorer 2006 line, whose C++ version never installed correctly for me under Vista anyways [.NET? *spit*]; MarcoV says Turbo Delphi worked, though).
Targeting Windows exclusively? Ick, yeah, people do that. Not that huge a deal (big install base) but kinda a hassle (too many .NET frameworks, 64-bit borg assimilation looming, lots of bugs). People love their C#, though.
I agree that Windows isn't the best platform to target. Heck, I know I'm on unsympathetic ears here, but I often think even Linux is too much of a moving target sometimes. (Don't get me started on Mac OS X, they deprecate everything too fast. *shudder*)
the current GPC users seem to be rather diverse (WRT dialects, platforms and features used etc.), which also doesn't bode well for a new project.
Like I said, your favorite dialect seems to be ISO 10206. (I saw you praising it in some old mail in list archive.) So you should probably just use that as a testbed for your C++ idea. It seems to have a fair bit in common with a few TP extensions, even, and (I think) it's a strict superset of ISO 7185 anyways. So everybody should be happy (famous last words)! ;-)
Don't get me wrong: I don't regret the time I spent with GPC, even if it was to die now.
Even DOS (FreeDOS), BeOS (Haiku), and OS/2 (eCS) aren't dead yet! Keep hope alive! ;-)
It was an interesting experience, learning a lot not only about the various Pascal dialects, but also about compiler construction in general (which has benefitted me in other projects since then), and since I did professional work with it, it was even paid. :-) And while it was at it, it was only natural to write my hobby projects in it as well. But as with many things, their time comes and it goes, and it may be its time has gone now and maintaining a compiler only for my hobby projects is just not efficient.
Yeah, computers can suck a lot of time out of you. Oddly enough, we spend so much time just to save more time later. But yeah, it's fun, so we (or at least I) don't mind much. ;-)
Frank Heckenbach wrote:
Hi everybody,
since GPC development has mostly stalled, I've thought about if and how its development could be continued. Here are my current thoughts about it. Since the text is quite long, I put it on the web: http://fjf.gnu.de/gpc-future.html
My 2 cents:
GPC has been a good project, and backed the standard when few others did. Although I don't think you can assume that FPC is that much better (they regularly complain about lack of developers), it seems that at least some of the following factors helped:
1. Complete compatibility with the Borland dialect.
2. Having the compiler itself written in its own language (Borland Delphi).
Although a rewrite of GPC in Pascal is interesting, I don't think it would help. It is a huge undertaking, and it still would not result in an all-Pascal project, since it must integrate with the GCC backend.
One suggestion I could make is that you might think about taking your knowledge and expertise into the FPC project. Just as GPC was an ISO 7185 compiler that added Borland compatibility, there is no reason why GPC elements such as ISO 7185 and ISO 10206 compatibility cannot be integrated into FPC. In fact, the FPC group has talked about ISO 7185 compatibility for quite some time now; they just don't see it as a priority. Most of the differences between FPC and ISO 7185 are covered by simple addition of missing features, and the few that are mutually exclusive between the ISO 7185 and Borland dialects can be covered by a compiler option.
My agenda, for anyone who knows me, is to see the original Pascal standard spread (as created by Wirth). However, I think the above comments make sense even if you are not a backer of original Pascal.
Just a thought.
Scott Moore http://www.standardpascal.org
Dear Frank,
I have now read the report in full, and enjoyed it. Thank you for your explanation of how GPC works.
Re: Why Continue Using Pascal?
I want to continue using Pascal. In fact, I can't see myself using another language. I can't send my non-engineering colleagues snippets of C code. They can't make head or tail of C code. But I can extract sections of Pascal directly from my source code and put them in my manuals, and they can see how it works right away. So I'm sticking with Pascal. It saves me a lot of time and effort.
Re: Graphical User Interfaces with Pascal.
I don't see any sense in using compiled code for graphical user interfaces when we have interpreters like Tcl/Tk and others that provide graphical user interfaces based upon scripts that can be run on any platform. There are hundreds of people working on these interpreters, and there is no way we can compete with their product, nor any reason to.
I use Pascal for computationally intensive operations upon objects in memory. I pass the results of image analysis, list sorts, or pattern construction to Tcl/Tk for rendering on the screen. It works fine. I have had no particular trouble linking to the Tcl/Tk libraries, which are compiled from C.
When I don't need graphics, I often use Pascal on its own for terminal and file input and output. I use the microsecond system timer (although it does not work in GPC on Windows) and the random number generator.
When it comes to debugging, I am always able to get by with writeln statements and looking at the console on Linux or MacOS. On Windows GPC has no particular place to write to, so I have more trouble there. But I place the debugger as a low-priority item.
Thus, I see no need for Pascal to provide more than its existing rudimentary interface with the local operating system (although the Windows version should be brought up to the level of the others).
Re: Self-Compiled GPC or Translation to C++
If the GPC compiler were written in GPC, I could work on fixing bugs. I would be happy to write or maintain the code that implements the mathematical run-time library (sin, cos, etc.). I understand that the self-compiling compiler requires a complete re-write. But if you could break the problem down into twenty sections, I expect you could find enough of us to take on most of the work and do it in a reasonable time.
As you seem to say in your report, the problem with a compiler written in C for Pascal users is that most of the people using Pascal are the very same people who loathe writing C code. If they did not loathe C, they would be writing C instead of being branded as weirdos for writing in Pascal.
Whatever you decide to do, I will support you to the extent that I am able. Thank you for all your work on GPC. It is a splendid compiler.
Yours, Kevan
Hi,
On 7/26/10, Frank Heckenbach ih8mj@fjf.gnu.de wrote:
since GPC development has mostly stalled, I've thought about if and how its development could be continued.
Well, I'm a Pascal noob, having barely learned a subset of it (off and on) since January. I'm also honestly not a very good programmer nor very experienced in *nix or C/C++. And to make matters worse, I like DOS (a lot)! ;-) But anyways, here are some random comments from me:
<rant>
GPC's strengths are its multi-platform, cross-compilable base with good optimizations and support for lots of dialects (and good docs and examples, IMHO). The downside is few developers (so?), relying on semi-outdated GCC versions (so?), somewhat hard to bootstrap on non-*nix (so?), big runtimes and thus big .EXEs (so?), lacking the latest / greatest from Delphi (so?), and lacking a public CVS/SVN repo for current sources (not needed by me, but ...).
Written in C shouldn't be a downside in theory, but I'll admit that C leaves some things to be desired and can be arcane (although I don't know if I think C++ is much better, syntax-wise). Heck, if you think C++ is better, just rewrite GPC in C++ ! It's interesting that Go and SPECS (C++) both took some syntax from Pascal due to inherent advantages.
Also, as we all know, FPC ain't too shabby either. Its main strengths are little reliance on GNU utils (and none on Win32, where it uses its own built-in assembler + linker), no libc needed, and compiling itself. But it also has very few developers and still lacks support for ISO 7185 or 10206. They follow Delphi very heavily (generics, AnsiString, Unicode, exceptions). To be honest, it's a little surprising that BP-style Pascal is so common and preferred. Esp. for ISO 10206, GPC seems like the only reasonable choice. (Seems Prospero/Win32 is freeware now, but it didn't install the "license" correctly for me on XP, yuk.)
I've made a small list (meaning to eventually post to comp.lang.misc) of various language translators. I too was wondering recently if it's the "way of the future." It would indeed solve a lot of problems (in theory). Of course, be aware that there will always be backend bugs, secondary platforms that don't get bugfixes or have old tools, etc. So even if you intend to support "standard" C++, there will always be problems, even on the "big three" OSes (which I hope you plan to extend beyond). In other words, some things which "should" work everywhere don't. Don't expect any miracles. (Try visiting OS News sometime to broaden your horizon.) ;-)
Long story short, there's already a BP-ish Pascal to C++ (but also some C sometimes) converter! :-)
http://www.garret.ru/pascal.html
(ignore weird comments about "shareware", it's free with sources)
Apparently there are also TPTC (old, very buggy), PTC (written in itself, very old, GPC dislikes it), and of course P2C (which GNU has on their retired/deprecated project list; of course, 1.20 is from 1991, and unfinished 1.21 from 1993, which I *think* had some partial Object Pascal support -> C++.)
It might be easier to just initially target one dialect (10206? isn't that your favorite?) as a testbed. Of course that's sorely overlooked. You'd get more support from users by targeting BP or Delphi, but those are weird in their own ways (various levels).
P.S. My really horrible Befunge-93 interpreter compiles via P2C with BCC/Dev86, which is the only way I can easily get it to build for ELKS (which is dead and I've never heavily used, heh, go figure). So things like that are useful! Still, I may just finally (also) convert it to C manually (already tried and failed twice) for smaller size. Translators are also good when one compiler backend has limitations (e.g. Minix 2.0.2 16-bit, pc's "integer" is 16-bit and I don't see any way of making it 32-bit except by using cc and "long").
</rant>
Rugxulo wrote:
Long story short, there's already a BP-ish Pascal to C++ (but also some C sometimes) converter! :-)
Tried it recently. Broken.
Scott Moore
Am 28.07.2010 06:20, schrieb Rugxulo:
Also, as we all know, FPC ain't too shabby either.
[...]
still lacks support for ISOs 7185 or 10206.
Well, there was no need to do so for the last 15 years: GPC supported ISO Pascal, FPC supported Borland-style Pascal. Why reinvent the wheel and cannibalize each other?
Hi,
we've been using GPC in our production environment on AIX for mathematical computations since IBM stopped developing their Pascal compiler. For our Windows GUIs we're using Delphi, with a single-source approach when it comes to computations. On AIX we're also accessing our DB2 databases with GPC through a self-written wrapper around IBM's DB2 API...
The only feature I'm really missing are exceptions. It's really hard to work with databases without them.
I would gladly volunteer as a tester for new features.
Marcus
Am 27.07.2010 00:16, schrieb Frank Heckenbach:
Hi everybody,
since GPC development has mostly stalled, I've thought about if and how its development could be continued. Here are my current thoughts about it. Since the text is quite long, I put it on the web: http://fjf.gnu.de/gpc-future.html
As I write there, I don't see much future in the current way of developing GPC, but also alternative development models will not be a task for a single person. In other words, without some new contributors, there is probably no chance for GPC development to continue.
I don't really know how many of you currently use GPC, and to what extent and in which ways, e.g., do you use it just to maintain some legacy code, or are you actively writing new applications?
So in order to tell whether continuing GPC development is worthwhile, I'd like to know who of you would actually care about major new features in GPC (as opposed to just preserving the status quo), and who would be interested not only in using GPC, but also supporting its continued development, either by actively contributing to it, or -- perhaps in the case of companies that use GPC -- by paying for continued development.
If it turns out there is no sufficient amount of interest, I'm afraid to say it's probably better to put an end to it now rather than further dragging along. (Of course, the existing GPC versions will continue to be available, and anyone who wants to can use and modify them, which the GPL already guarantees, but without prospects for the future, I would then retire from GPC development and start to rewrite my own Pascal programs in other languages.)
Frank
Frank Heckenbach wrote:
since GPC development has mostly stalled, I've thought about if and how its development could be continued. Here are my current thoughts about it. Since the text is quite long, I put it on the web: http://fjf.gnu.de/gpc-future.html
As I write there, I don't see much future in the current way of developing GPC, but also alternative development models will not be a task for a single person. In other words, without some new contributors, there is probably no chance for GPC development to continue.
I don't really know how many of you currently use GPC, and to what extent and in which ways, e.g., do you use it just to maintain some legacy code, or are you actively writing new applications?
So in order to tell whether continuing GPC development is worthwhile, I'd like to know who of you would actually care about major new features in GPC (as opposed to just preserving the status quo), and who would be interested not only in using GPC, but also supporting its continued development, either by actively contributing to it, or -- perhaps in the case of companies that use GPC -- by paying for continued development.
The devil is in the details. For me, a compiler is a professional tool and the quality of the tool is more important than new features. A handful of weak points may become serious deployment obstacles.
Current GPC problems:
1. wrong line numbers in debug code
2. wrong and/or incomplete debug-symbol info, notably for 64-bit code
3. order-2 performance of unit symbol loading
4. order-2 performance of operator overloading symbols
   http://www2.gnu-pascal.de/crystal/gpc/en/mail12897.html
5. stalled GCC back-end development/integration (a show stopper when the target OS requires a recent GCC)
I am not complaining or opposing new features. I am saying that a future path for GPC must tackle these and similar problems. Questions must be answered. Does (5) imply moving away from the low-level GCC back-end? What are the alternatives?
1. producing C/C++/Objective-C. What about the debugging experience? What about GDB maintenance? What about targeting the LLVM debugger?
2. producing LLVM assembly: http://llvm.org/docs/LangRef.html
3. LLVM integration: http://llvm.org/docs/tutorial/ (also a moving target, with a moving C++ API)
4. adding ISO and GPC language modes to FPC. This would be an opportunity to redesign (parts of) the FPC parser (for example, to get rid of nasty code implemented as a side effect of type-checking and operator/function overloading)
Anyway, I can help if the compiler is written in Pascal.
Regards,
Adriaan van Os
Thanks for your replies so far. Here are some comments as a summary reply to several of your mails, in no particular order.
Kevan Hashemi wrote:
But if the compiler is written in Pascal, and compiles itself, I will be very happy to start trying to host the development of GPC myself, using some sort of version control software like Git.
plus I have static IP addresses for hosting the code. At the very least, I could maintain the existing versions.
Thanks for the offer, but unfortunately that's not my main concern. Our server (where the web site is currently hosted) has static IP addresses and more than enough free bandwidth, and we could probably set up git easily (or even move to SourceForge etc. if this were advantageous), but I'm afraid infrastructure will not write a program. (I've been burned here before -- with several previous initiatives, including documentation, translation, libraries, and setting up a CVS, there was always much interest in creating the infrastructure (or at least in discussing it), but once it was done (or not done), not much actual work happened.)
I usually work on MacOS, so I could cover the interface with the MacOS operating system. We would need other people to work on the interface with the other operating systems.
Also I'm afraid to say that this isn't one of my main concerns. We have runtime libraries, and as I listed under "Reusable parts", they could probably more or less be kept. The actual problem is really the compiler proper.
When it comes to debugging, I am always able to get by with writeln statements and looking at the console on Linux or MacOS. On Windows GPC has no particular place to write to, so I have more trouble there. But I place the debugger as a low-priority item.
Of course, I've also done much writeln-debugging with GPC (for lack of alternatives) and sometimes recently also in C++ (out of habit ;-), but honestly, having a real debugger (gdb with C++) available is quite helpful sometimes, especially to find more complex bugs. For me, it's at least medium priority.
Thus, I see no need for Pascal to provide more than its existing rudimentary interface with the local operating system (although the Windows version should be brought up to the level of the others).
I also have the need for an extended OS interface (though obviously not Windows, but Unix), but again, that's not my main concern. If that was all, I'd probably write it within a few days and be done.
If the GPC compiler were written in GPC, I could work on fixing bugs. I would be happy to write or maintain the code that implements the mathematical run-time library (sin, cos, etc.). I understand that the self-compiling compiler requires a complete re-write.
Actually, GPC's runtime library is already written in Pascal for the most part (i.e., except for the direct libc interface), so that's just the area that would require the least amount of work.
Felipe Monteiro de Carvalho wrote:
One suggestion: Why not implement a syntax mode in Free Pascal which is compatible with the current GPC syntax?
That would merge both projects and avoid such trouble.
If there is no volunteer to do this work, but someone interested in its completion has some money (at least 200 EUR for doing the initial gross work, more later for the fine-tuning), I could try to find an implementor for it in the context of the Lazarus Season of Code: http://wiki.freepascal.org/Lazarus_Season_of_Code
I'm skeptical given what I heard from FPC advocates WRT standard Pascal etc. before. Not wanting to rehash old and less than pleasant discussions, all I'll say WRT my participation in any project with FPC is: Not interested.
Of course, if you'd like to do this, go ahead -- and good luck!
John L. Ries wrote:
I find it interesting that when I was a young CS major back in the 1980s, I somewhat resented Pascal's status as the "official religion" in my university's CS department, but as a professional programmer, my appreciation for the way it encourages good programming practices has increased by leaps and bounds. At the same time, my attitude toward C has gone from infatuation to active dislike.
Again, I'd prefer to leave out C (which I'd suggest to no one as an alternative to Pascal) and talk about C++ instead, which is indeed quite a bit more high-level (though it carries all the ballast from C -- but unfortunately so does GPC, often disguised as BP extensions).
Of course, I have gathered most of my programming experience in Pascal and applied it in C++ later, and I don't know how my habits would have developed if I had started in C++.
That said, I agree in principle with the proposal to rewrite GPC in Pascal, though if it can be made easy to keep the existing GPC in sync with the GCC back end, it would free up time for the core developers to work on the rewrite.
Unfortunately, that's just the main problem here. As I wrote, I figure it would take several weeks just to "fix" gpc with gcc-4.1, after Waldek spent months porting it to gcc-4 in the first place. Then additional effort is required to port it to the current gcc, test and fix it, and by then probably new gcc versions will have appeared, possibly with major changes.
And I'm talking about full-time development here. I may have a few weeks available now, but then I'll have to concentrate more on other projects again and only have occasional time for Pascal. The same goes for Waldek. And for new developers, even if there were some, it would take some time to get used to the backend to become able to do any productive work.
So, sorry, I just don't see a way to get, let alone keep, GPC in sync with the backend. I had hoped so when I started, but since then the distance has probably even increased, despite the extensive efforts of Peter, Waldek and me.
I'm not sure how much language extension is required,
"Required" is, of course, subjective here. I tried to explain why something like templates is important to me (and adding templates to the language is very much non-trivial) -- in other words, a way to write a structure (such as a list, hash table or tree) that can be applied to *any* type.
but Pascal greatly suffers from a lack of bindings to widely used libraries, including and especially GUI toolkits. I know there is a tool to facilitate creation of Pascal bindings to C/C++ libraries, but it's not (as far as I can see) distributed with GPC, or mentioned on the GPC website. I don't really understand why the existing GPC couldn't be used to bootstrap a new compiler written entirely in Pascal, but perhaps Frank could elaborate on why his suggested C++ converter is preferable as an interim step.
Perhaps I was unclear, but I didn't mean it as an interim step, but as the actual target for the alternative development (i.e., getting away from the backend). An advantage WRT your point would be that the generated C++ code could include the C/C++ headers directly, so interfacing would become easier (though not completely trivial, as I explained in my article).
Scott Moore wrote:
Although a rewrite of GPC in Pascal is interesting, I don't think it would help. It is a huge undertaking, and it still would not result in an all-Pascal project, since it must integrate with the GCC backend.
See my previous paragraph. My point of the rewrite would be that it would *not* integrate with the gcc backend, but output C++ code (which would then probably be compiled with g++ using the gcc backend, but we're not involved with that then).
Prof Abimbola Olowofoyeku (The African Chief) wrote:
My one concern would be about producing yet another Pascal object model that is not compatible with any of the existing ones. But I guess that, if the old BP model is still supported so that existing code would compile, that would be fine.
That's kind of what I half expected and half feared. If I'm the only one to want a new (or at least extended) object model (in particular WRT enforced constructors and automatic destructors) or other features (such as templates), there's no point implementing them.
Adriaan van Os wrote:
The devil is in the details. For me, a compiler is a professional tool and the quality of the tool is more important than new features. A handful of weak points may become serious deployment obstacles.
Current GPC problems:
1. wrong line numbers in debug code
2. wrong and/or incomplete debug-symbol info, notably for 64-bit code
3. order-2 performance of unit symbol loading
4. order-2 performance of operator overloading symbols
   http://www2.gnu-pascal.de/crystal/gpc/en/mail12897.html
5. stalled GCC back-end development/integration (a show stopper when the target OS requires a recent GCC)
I am not complaining or opposing new features. I am saying that a future path for GPC must tackle these and similar problems. Questions must be answered. Does (5) imply moving away from the low-level GCC back-end? What are the alternatives?
As I wrote, the only way I see WRT (5), as far as I'm concerned, would be moving away from the backend, and my alternative would be to output C++. This could also help solve (1) (when being careful about emitting line breaks and/or inserting "#line" directives), and partly (2) (the debug info should at least be usable, but as I noted, types in particular would, at least initially, be in C++ notation).
For (3) and (4) it wouldn't help, but if the new compiler was written in a high-level language (i.e., C++ or its own new Pascal dialect) using high-level data structures (e.g., template based), it would make it easier to tackle those problems.
- producing C/C++/Objective-C. What about the debugging experience? What about GDB maintenance?
Debugging plain C++ works well with gdb, even multi-threaded programs. I think at least for the mainstream languages (C and C++) gdb is currently well maintained.
Debugging Pascal translated to C++ would, of course, be inferior, see above, but still it would be a huge step forward from the current state of affairs.
- producing LLVM assembly http://llvm.org/docs/LangRef.html
That might be an option. However, the problems I see are (a) I'm not familiar with LLVM assembly (in contrast to C++), and I'm not sure anyone else here is, so it would take additional learning before one could get productive, and (b) it's, of course, low-level, so we'd have to reimplement things like the object-model, templates, exceptions etc., that C++ already has.
- LLVM integration http://llvm.org/docs/tutorial/ (also a moving target with a moving C++ API)
As you can imagine, I've become skeptical of moving targets.
Frank
Dear Frank,
Thanks for the offer, but unfortunately that's not my main concern.
Okay. I'm slowly figuring out how I can help.
In order to create a GPC that compiles itself, we need run-time libraries that provide OS routines. I assume that every OS provides such libraries independent of GCC's existence or version in the system.
Given that GPC's Pascal is, in your own words, Turing Complete, I assume it is possible to write in the current version of GPC the code to implement any future version of GPC, regardless of the new features. I am not daunted by the difficulties you list in implementing new features, such as destructors. For example, going back and re-writing sections of GPC using its own new features is optional, and if the original code is clear, should be painless.
Thus, re-writing GPC in Pascal seems to me to be a Pascal programming job. If there are enough development hours available, it is an elegant and long-lasting solution to GPC's problems.
I can program in Pascal. I could write a lexer, a parser, or any other part of the program. I will be happy to help out. If I'm starting with C code that needs to be translated into Pascal, then I will probably be able to get translations done with the minimum of help from you.
In the long run, I would be glad to help maintain a self-compiling GPC. If you choose the self-compile route, I'll up my offer of two hours a week to five hours a week for the next three months.
Furthermore, this is not a rush job: the existing GPC works fine for now. If we get the self-compile version done in a year, that will be soon enough, and it will last for the rest of our lives.
Yours, Kevan
Kevan Hashemi wrote:
In order to create a GPC that compiles itself, we need run-time libraries that provide OS routines. I assume that every OS provides such libraries independent of GCC's existence or version in the system.
Again, we have the libraries (i.e., the basic runtime library -- not including stuff like graphics, see my mail about GRX, but file handling etc., including everything that a compiler needs, which is actually not much as far as the environment is concerned). On every platform where GPC runs, almost by definition, its runtime library runs.
Given that GPC's Pascal is, in your own words, Turing Complete, I assume it is possible to write in the current version of GPC the code to implement any future version of GPC, regardless of the new features.
Yes, it's possible, but it's IMHO not comfortable. I know I'm repeating myself, but I'm really fed up with having to "reimplement the list", as I wrote. It's also a psychological thing IMHO -- if advanced data structures (objects, trees, hash tables) are readily available (as in C++), you tend to use them when useful; if they're not (as in Pascal, except for objects), you're much more likely to do quick ad-hoc work-arounds, which affect the efficiency, but also the design of your program, especially in the long run. GCC's TREE_NODEs are a particularly bad example of this, spreading all over the code, but also in my Pascal programs, I often find a lot of ad-hoc list (or other structure) handling code that obscures the actual purpose of the code and thus makes it harder to understand later.
Thus, re-writing GPC in Pascal seems to me to be a Pascal programming job. If there are enough development hours available, it is an elegant and long-lasting solution to GPC's problems.
Unfortunately I think that's a big "if". Your offer is appreciated, but AFAICS, even in the optimistic case (no distractions will reduce the available time in the foreseeable future), we're still way short of the necessary "manpower" to create a new compiler.
I can program in Pascal. I could write a lexer, a parser, or any other part of the program.
One note: For any project that I'd be involved in, I'd insist that the parser is created by a tool, not hand-made. It doesn't necessarily have to be Bison, but it seems a good choice since we already have the GPC language grammar in Bison form -- as I wrote, the actions would have to be rewritten, but that's the easy stuff; the hard part, the grammar, could be kept essentially unchanged, including, e.g., the handling of near-ambiguities that arose from the combination of several dialects (or sometimes from single-dialects alone -- Borland, I'm looking in your direction) and other issues.
Why do I think so? Mainly because I'm convinced that designing languages (and a new GPC would add new features, including new syntax) using hand-made parsers usually leads to bad syntax, i.e., ambiguous, nearly ambiguous or otherwise problematic.
A typical example for me is BP (or rather, back then, TP), which was written, for all we know, with a parser hand-written in assembler. They designed some problematic syntax, worst of all "^C" style character constants. A parser generator would have told them right away that these cause serious problems (and thus were a bad idea). Writing a parser by hand doesn't tell you so, so they happily added them, probably without even noticing that their own new feature didn't work in many cases they simply hadn't thought of.
Certainly, this was an easy case, and an experienced language designer might have seen it coming without any tools, but in a language as complex as GPC (again, mainly due to the combination of dialects), conflicts are often surprising and in unexpected places, yet a parser generator like Bison detects them.
The lexer (generated by Flex) is a smaller issue, but I see no reason to switch from it either.
I will be happy to help out. If I'm starting with C code that needs to be translated into Pascal, then I will probably be able to get translations done with the minimum of help from you.
It would be more rewriting than translating, i.e., understanding what the C code does (with the backend) and implementing the same in Pascal (with, say, C++ output). The big difficulty is to match the behaviour exactly (so the new GPC would be compatible with the existing one), yet with a completely different target, with largely different internal structures (i.e., high-level structures instead of TREE_NODEs), and in a different language. That (combined with the size of the compiler) is not an easy task by any measure, obviously.
Frank
Dear Frank,
Yes, it's possible, but it's IMHO not comfortable.
Okay, I think I understand the extent of the problems with re-writing the GPC C-code in Pascal, and I appreciate that there may not be enough development hours available.
I am going to make sure that I have a multi-platform Pascal compiler, even if I have to re-write the entire thing myself. I agree with Pascal Viandier: GPC is by far the best Pascal compiler I have ever used, and I have no intention of taking a step backwards to C++ or another Pascal compiler.
One note: For any project that I'd be involved in, I'd insist that the parser is created by a tool, not hand-made.
I agree. Sounds like a lot less work, too.
It would be more rewriting than translating, i.e. understand what the C code does (with the backend)
If we create a self-compiling compiler, do we need our own assembler written in Pascal as well? Or can we produce GCC-style objects and use any GCC linker and assembler to produce our executable? What is it in the GCC interface with Pascal that keeps changing? Is it the object format? You give one example to do with restricted language flags in your report. Perhaps you could give us some more examples.
Yours, Kevan
On Fri, 30 Jul 2010 05:48:29 pm Frank Heckenbach wrote:
A typical example for me is BP (or rather, back then, TP), which was written, for all we know, with a parser hand-written in assembler. They designed some problematic syntax, worst of all "^C" style character constants. A parser generator would have told them right away that these cause serious problems (and thus were a bad idea). Writing a parser by hand doesn't tell you so, so they happily added them, probably without even noticing that their own new feature didn't work in many cases they simply hadn't thought of.
I'm sorry, I don't understand what you mean by "^C" style constants, or why they are a bad idea. Can you explain please?
Steven D'Aprano wrote:
On Fri, 30 Jul 2010 05:48:29 pm Frank Heckenbach wrote:
A typical example for me is BP (or rather, back then, TP), which was written, for all we know, with a parser hand-written in assembler. They designed some problematic syntax, worst of all "^C" style character constants. A parser generator would have told them right away that these cause serious problems (and thus were a bad idea). Writing a parser by hand doesn't tell you so, so they happily added them, probably without even noticing that their own new feature didn't work in many cases they simply hadn't thought of.
I'm sorry, I don't understand what you mean by "^C" style constants, or why they are a bad idea. Can you explain please?
Borland had the brilliant idea of adding the syntax ^C for character constants to represent "Ctrl-C", i.e. Chr (3).
The major purpose seemed to be that key handlers could be written like this:
case ReadKey of
  ^A: ...;
  ^B: ...;
end;
The problem? "^" is already used in Pascal syntax, to define pointer types and to dereference pointers.
At first sight, this might seem harmless because it might seem these occur only in different contexts. But it isn't so, and that's why I said an automatic parser generator helps, because it finds the problem immediately (which Bison did for us when we implemented this feature):
type
  C = Integer;   { or whatever }
  X = ^C;        { pointer to C }
  Y = ^C .. ^D;  { character subrange }
As you see, X and Y look the same until "..", but the part before ".." has a radically different meaning. It's not impossible to handle this -- in fact GPC does it with some tricks. But BP itself can't handle it, which obviously means they didn't understand their own feature.
Of course, the whole issue is so ridiculous because this feature is so superfluous. It's a new syntax element for the whole purpose of defining 26 possible constants.(*) It would have been far easier (and unproblematic) just to define them as symbolic constants, say in the CRT unit:
const CtrlA = Chr (1); [...]
(Sure, "CtrlA" is longer to type in the source code than "^A", but for a complete alphabet handling, that's a full 78 characters more -- even for 1980s PCs, not an issue.)
(*) Actually that's not quite correct. BP doesn't only accept letters after "^" in this meaning, but any character -- yes, even including "{". I leave it to your darkest fantasies to imagine the ambiguities this can cause. (I suppose this was unintentional, given that the formula for which characters they produce is quite strange and only makes sense for letters, but how hard can it be to check that the character is actually a letter?)
Frank
On Wed, 2010-07-28 at 22:52 +0200, Frank Heckenbach wrote: [...]
My one concern would be about producing yet another Pascal object model that is not compatible with any of the existing ones. But I guess that, if the old BP model is still supported so that existing code would compile, that would be fine.
That's kind of what I half expected and half feared. If I'm the only one to want a new (or at least extended) object model (in particular WRT enforced constructors and automatic destructors) or other features (such as templates), there's no point implementing them.
An extended object model is fine, and templates sound like a very good idea. My main concern was about existing code. If existing code would still compile, then that concern goes away. Enforced constructors are not a problem (that is already required in the Delphi class model -- your program will crash if you don't call a constructor to instantiate your object). I don't know how automatic destructors work, but I don't have a problem with the idea (in Delphi you would have to call object.Free -- a method that calls the appropriate destructor). IMHO, anything that makes the programmer's work less error-prone is good. But it makes sense (if at all possible without a disproportionate amount of effort) to develop/extend the object model in a way that is compatible (as far as possible) with an existing model (perhaps as a superset of one or more).
I don't know what you have in mind WRT the object model, and I don't know whether "automatic" con-/destructors refers to implicit or explicit con-/destructors. And, would there be a common ancestor (type or template) for all objects? And what would be the difference with the BP model?
Example:

{ BP model }
type
  foo = object
  end;
{ foo has no con-/destructor, which are totally unnecessary;
  and it has no inherent behaviour or attribute }
{ extended model }
type
  foo = extended_object
  end;
{ does foo have any con-/destructor, and/or any inherent
  behaviours or attributes? what if I want to add my own
  con-/destructors? }
This is just a matter of curiosity for me, and the actual answer is neither here nor there. I think many programmers tend to look at new compiler features with great interest (as long as they don't break existing code) and I am sure that this will happen with a revamped GPC. It may even be that other Pascal compilers would copy the new features (e.g., Templates).
The bottom line: although a self-compiling GPC would be wonderful if possible without undue effort, it doesn't really matter what language the revamped GPC is written in (although I personally could only hope to contribute Pascal code). C++ is as good as any other language, and I see no philosophical reason against it.
I know very little about C++ standards, so what follows may simply be the result of my ignorance. The question is - how confident are you that the compiler's translation from Pascal to C++ would always compile? What C++ standard would it generate code for? Is there a lowest common denominator "standard" that would compile on any C++ compiler? How would GPC verify that there is a compliant C++ compiler? Would the GPC compiler itself compile with every standards-compliant C++ compiler? (for example, would I be able to build it on Windows with MSVC, and g++, and Borland/Turbo C++? - which would be wonderful).
Conclusion: for my part, as long as one can still run: "gpc foo.pas [blah blah]" and end up with a compiled program "foo[.exe]", then all is well.
If you can provide a list (and specifications) of Pascal tasks that need doing (and required coding principles and conventions) to get the revamped compiler going, then I am sure that there will be many volunteers (present company included).
Prof Abimbola Olowofoyeku (The African Chief) wrote:
On Wed, 2010-07-28 at 22:52 +0200, Frank Heckenbach wrote: [...]
My one concern would be about producing yet another Pascal object model that is not compatible with any of the existing ones. But I guess that, if the old BP model is still supported so that existing code would compile, that would be fine.
That's kind of what I half expected and half feared. If I'm the only one to want a new (or at least extended) object model (in particular WRT enforced constructors and automatic destructors) or other features (such as templates), there's no point implementing them.
An extended object model is fine, and Templates sound like a very good idea. My main concern was about existing code. If existing code would still compile, then that concern goes away. Enforced constructors are not a problem (that is already required in the Delphi Class model - your program will crash if you don't call a constructor to instantiate your object).
That's quite the opposite of what I mean by "enforced constructors". In my meaning (as in C++), if the class designer provides (one or more) constructors, they can be sure that one of them will be called for every object, whatever the user does (unless, of course, they use low-level tricks to specifically work around it). Also, of course, any internal fields (VMT tables etc.) will be automatically set at this point.
In BP (and AIUI also in Delphi), both things are not guaranteed -- the VMT is not initialized if you just "New" an object pointer in BP without calling a constructor (which will usually cause crashes later on, as soon as you call a virtual method), and none of the provided constructors (if any) is guaranteed to be called.
I don't know how automatic destructors work, but I don't have a problem with the idea (in Delphi you would have to call object.free - a method, that calls the appropriate destructor).
Same as in BP then, so if you forget it, it will not be called. What's more, the compiler cannot automatically call the destructor since there may be several of them, or they may take parameters, so objects can't really be used as temporary values (my example: f (g) where g is a function that returns an object and f is a procedure that takes an object parameter -- since the object exists only temporarily and anonymously, the programmer can't call a destructor explicitly, so if the compiler can't call the destructor automatically, it can't be called at all).
IMHO, anything that makes the programmer's work less error-prone is good. But it makes sense (if at all possible without a disproportionate amount of effort) to develop/extend the object model in a way that is compatible (as far as possible) with an existing model (perhaps as a superset of one or more).
Sure.
I don't know what you have in mind WRT the object model, and I don't know whether "automatic" con-/destructors refers to implicit or explicit con-/destructors.
Both (see below).
And, would there be a common ancestor (type or template) for all objects?
In general, no. Neither the BP object model nor the C++ model do, so if only for BP compatibility, there's none. ISTR some of the other models might have one, so for them there would be one.
And what would be the difference with the BP model?
Example: { BP model } type foo = object end; { foo has no con-/destructor, which are totally unnecessary; and it has no inherent behaviour or attribute }
{ extended model } type foo = extended_object end; { does foo have any con-/destructor, and/or any inherent behaviours or attributes?
C++: Foo has a default constructor and destructor, both empty (no parameters and doing nothing), so in this case the automatic calling of the automatic con-/destructor reduces to nothing (and is, of course, optimized away when safe).
However, if foo has any fields that are objects (not pointers or references, but actual objects), then the default con-/destructor of foo calls those of its fields. A typical example is a string field, where string is an STL type that has to initialize some of its fields to a clean state in its constructor, and to free its dynamic memory in its destructor.
An object with a string field doesn't have to do anything (i.e., the programmer doesn't have to write or declare any con-/destructor, as above), but still gets a default con-/destructor that calls those of its fields, so thanks to the automatic calling of foo's con-/destructor, its string field is handled properly. (But of course, foo *can* declare its own con-/destructors to do additional work.)
If foo inherits from bar, foo's default con-/destructor will also automatically call bar's. This is a big difference from BP where you have to call the inherited con-/destructor explicitly. IMHO that's a big source of errors with negligible benefit.
Furthermore, each class has a "copy constructor" by default that takes a reference to an object of the same class and creates a field-by-field copy (which means a binary copy for simple fields, and for fields that are objects, using their copy constructor). This constructor can be overridden or suppressed by the class designer. So this settles the question (which is somewhat unclear in Pascal) whether objects can be copied (by assignment or passing as value parameters): By default, they can. If the copy constructor is suppressed, they can't. If it's overridden, they can, but they "know about it" -- e.g., if you want each object to have a unique ID, you can implement a copy-constructor that creates a new ID and copies all other fields (or whatever you like).
what if I want to add my own con-/destructors?}
As I wrote, in the BP model, they should probably be output as regular C++ methods, and explicit calls (even in "New" and "Dispose") would be translated to explicit method calls. (Which would still allow the automatic C++ constructors to do their work, as far as e.g. fields are concerned, i.e. if a BP object has a field that is a string with an automatic constructor, it would be initialized even if you don't call the constructor in "New". I'd consider this a feature. In pure BP compatible programs, this situation would simply not occur.)
In the "extended model", it should, of course, be possible to declare real (automatic) C++ con-/destructors with all their benefits (which I hope to explain above). But I haven't thought about details yet, including syntax or whether and how to mix those objects with the other models.
I know very little about C++ standards, so what follows may simply be the result of my ignorance. The question is - how confident are you that the compiler's translation from Pascal to C++ would always compile?
Of course, the new compiler will have bugs, which will have to be found and fixed, and this would be just one class of them. (Actually, less harmful than compiling and producing wrong behaviour.)
What C++ standard would it generate code for? Is there a lowest common denominator "standard" that would compile on any C++ compiler?
Ideally strictly the current C++ standard (C++98), a few years from now perhaps the next standard (C++0x). But if necessary, I'd go for compiler extensions, as long as the compiler is free software and reasonably portable (in particular g++ which is at least as portable as the current GPC, so we wouldn't lose anything).
How would GPC verify that there is a compliant C++ compiler?
By testing. All tests would run through the "New GPC" + C++ stages, and bugs found would have to be traced to see whether they're in the new GPC (most of them) or in the C++ compiler (in which case, as I wrote, they could be submitted as regular bugs to the C++ compiler maintainers -- unlike GCC backend bug reports from the current GPC which have often been ignored).
Would the GPC compiler itself compile with every standards-compliant C++ compiler? (for example, would I be able to build it on Windows with MSVC, and g++, and Borland/Turbo C++? - which would be wonderful).
Ideally yes, though I won't be making extra efforts required to support them. (As a case in point, we recently had to compile a little part of our C++ code under MSVC to interface with 3rd party C++ code under Windows. It was problematic -- it wouldn't even start to compile until we #included some obscure special header, and it needed some other little changes. (If interested, ask Peter for details.) I wouldn't like to sprinkle the new compiler with such "#ifdef"s, but if it can be managed to keep them localized, this might be an option. Think of something like os-hacks.h.)
Frank
This discussion is in danger of drifting off topic (the future of GPC development) to religious wars (OSes). So can we please bring it back to the topic?
The most crucial point for me is portability and cross-platform development. Right now, I can write a program, and compile it for Windows, Linux, and my embedded system (accessed via smbfs). If I need to, I can compile the same program for Solaris Sparc and Dos as well (and have done so in the past). This is the reason for my earlier question about which C++ standard the code generated by the renewed GPC would target. It is also the reason why I am in full support of Frank's suggestion of C++, in preference to any other language, as far as the generated code is concerned. I know that almost every decent OS on the planet has a decent C++ compiler (probably g++), and I don't believe that this is likely to change anytime soon. So that seals it for me.
Best regards, The Chief -------- Prof. Abimbola A. Olowofoyeku (The African Chief) web: http://www.greatchief.plus.com/
Dear Chief,
So that seals it for me.
I agree that a Pascal to C++ translator would be just fine. But on the other hand, that means that we need people who enjoy programming in both Pascal and C++ to support the project. Anyone who fits that description is not going to have much use for the product, because they might as well program in C++ in the first place.
The people who are dedicated to GPC are people who greatly prefer to program in Pascal. Most such people dislike programming in C++. So who is going to write this translator? Not me. Is Frank going to write it on his own?
It may be ten times as much work to re-write the compiler in Pascal, but we may find that we have a hundred times as much developer time available.
One way to proceed is for Frank to estimate how many developer hours are required for a Pascal compiler in Pascal, and for a Pascal to C++ translator, then we poll the list to see how many hours people are prepared to dedicate to each project. If only one of them gets enough hours, then we have only one practical solution.
Yours, Kevan
On Fri, 30 Jul 2010, Kevan Hashemi wrote:
Dear Chief,
So that seals it for me.
I agree that a Pascal to C++ translator would be just fine. But on the other hand, that means that we need people who enjoy programming in both Pascal and C++ to support the project. Anyone who fits that description is not going to have much use for the product, because they might as well program in C++ in the first place.
The people who are dedicated to GPC are people who greatly prefer to program in Pascal. Most such people dislike programming in C++. So who is going to write this translator? Not me. Is Frank going to write it on his own?
It may be ten times as much work to re-write the compiler in Pascal, but we may find that we have a hundred times as much developer time available.
One way to proceed is for Frank to estimate how many developer hours are required for a Pascal compiler in Pascal, and for a Pascal to C++ translator, then we poll the list to see how many hours people are prepared to dedicate to each project. If only one of them gets enough hours, then we have only one practical solution.
The more I think about it, the clearer it is to me that the Pascal to C++ translator is the option discussed that I like the least. I actually would prefer a Pascal to Ada translator more than Pascal to C++, because if I had to abandon Pascal, I think I'd be much more comfortable programming in Ada than in C/C++ (I don't know enough about D to have an opinion). All things being equal (which they never are), a rewrite of GPC in Pascal is the option I like the best, followed by an effort to add full Extended Pascal/GPC language support to FPC (which has the advantage of already being written in Pascal).
Since I don't have any experience in compiler development, I don't know how useful I could be (except for maybe as a tester), but I think I could devote a couple of hours per Saturday to assist in a rewrite of GPC in Pascal, or on an FPC merger.
--
John L. Ries
Salford Systems
Phone: (619)543-8880 x107 or (435)867-8885
Frank Heckenbach wrote:
Adriaan van Os wrote:
The devil is in the details. For me, a compiler is a professional tool and the quality of the tool is more important than new features. A handful of weak points may become serious deployment obstacles.
Current GPC problems:
1. wrong line numbers in debug code
2. wrong and/or incomplete debug-symbol info, notably for 64-bit code
3. order-2 performance of unit symbol loading
4. order-2 performance of operator overloading symbols (http://www2.gnu-pascal.de/crystal/gpc/en/mail12897.html)
5. stalled GCC back-end development/integration (a show stopper when the target OS requires a recent GCC)
<snip>
For (3) and (4) it wouldn't help, but if the new compiler was written in a high-level language (i.e., C++ or its own new Pascal dialect) using high-level data structures (e.g., template based), it would make it easier to tackle those problems.
If each unit compiles into a separate C++ header file, C++ compilation times will suffer, even if the headers are precompiled. One way I can think of to get around that is a whole-program Pascal compilation mode that translates into one whole-program C++ file that copies only those unit-declarations that are actually used in the program.
Regards,
Adriaan van Os
On 29 Jul 2010, at 09:47, Adriaan van Os wrote:
If each unit compiles into a separate C++ header file, C++ compilation times will suffer, even if the headers are precompiled. One way I can think of to get around that is a whole-program Pascal compilation mode that translates into one whole-program C++ file that copies only those unit-declarations that are actually used in the program.
Note that this generally causes even larger compilation times compared to separate files (unless you enable the compiler's whole program optimization infrastructure, in which case using separate files will be a bit slower), because all "regular" optimizations (-Ox) are limited to the current compilation unit.
If you put everything in a single compilation unit, then the compiler can optimize more, but if the goal is speeding up compilation then it will probably be counterproductive (possibly unless you only care about compilation speed for -O0 and not at all when using -O1 or higher).
Jonas
Jonas Maebe wrote:
On 29 Jul 2010, at 09:47, Adriaan van Os wrote:
If each unit compiles into a separate C++ header file, C++ compilation times will suffer, even if the headers are precompiled. One way I can think of to get around that is a whole-program Pascal compilation mode that translates into one whole-program C++ file that copies only those unit-declarations that are actually used in the program.
Note that this generally causes even larger compilation times compared to separate files (unless you enable the compiler's whole program optimization infrastructure, in which case using separate files will be a bit slower), because all "regular" optimizations (-Ox) are limited to the current compilation unit.
If you put everything in a single compilation unit, then the compiler can optimize more, but if the goal is speeding up compilation then it will probably be counterproductive (possibly unless you only care about compilation speed for -O0 and not at all when using -O1 or higher).
It depends on where the compiler is spending its time, generating code or processing declarations, reading unit symbols, etcetera. For GPC on Mac OS X, the bottleneck is absolutely the latter. And (for me) compilations during development (with -O0) are more frequent than release builds (with -O3).
Regards,
Adriaan van Os
On 29 Jul 2010, at 11:05, Adriaan van Os wrote:
Jonas Maebe wrote:
If you put everything in a single compilation unit, then the compiler can optimize more, but if the goal is speeding up compilation then it will probably be counterproductive (possibly unless you only care about compilation speed for -O0 and not at all when using -O1 or higher).
It depends where the compiler is spending its time, generating code or processing declarations, reading unit symbols, etcetera. For GPC on Mac OS X, the bottleneck is absolutely the latter.
I don't think the current GPC unit loading architecture is comparable to how a C++ backend would work. And as I said, it very much depends on the size of the compilation unit: larger compilation units generally take more or less linearly more time to parse, but the optimization time generally rises much more steeply (especially with newer compilers that perform more inter-procedural analyses).
At least that's my experience with compiling Apple's ld64, whose code is virtually all contained in large .hpp files that in turn are included in the main .cpp file. It's only about 680KB of code (mild usage of templates, but no complex template programming), but it takes a disproportionately long time to compile.
Jonas
Jonas Maebe wrote:
At least that's my experience with compiling Apple's ld64, whose code is virtually all contained in large .hpp files that in turn are included in the main .cpp file. It's only about 680KB of code (mild usage of templates, but no complex template programming), but it takes a disproportionately long time to compile.
That sounds like a GNU C++ compiler issue. What if you compile with Clang ?
Regards,
Adriaan van Os
Jonas Maebe wrote:
On 29 Jul 2010, at 11:05, Adriaan van Os wrote:
Jonas Maebe wrote:
If you put everything in a single compilation unit, then the compiler can optimize more, but if the goal is speeding up compilation then it will probably be counterproductive (possibly unless you only care about compilation speed for -O0 and not at all when using -O1 or higher).
It depends where the compiler is spending its time, generating code or processing declarations, reading unit symbols, etcetera. For GPC on Mac OS X, the bottleneck is absolutely the latter.
I don't think the current GPC unit loading architecture is comparable to how a C++ backend would work. And as I said, it very much depends on the size of the compilation unit: larger compilation units generally take more or less linearly more time to parse, but the optimization time generally rises much more steeply (especially with newer compilers that perform more inter-procedural analyses).
I think so. Also, putting everything in one source file means that everything has to be recompiled with every change anywhere.
What's taking so long currently, Adriaan, is probably the GPI imports. That's independent of the target, i.e. a Pascal to C++ converter would have to do it just the same, so its complexity is independent of how the output is structured. As I said, this is a separate issue which would be easier to tackle if the compiler was written in a high-level language (such as C++ or Pascal with templates). It's easier to find and experiment with efficient data structures (e.g., hash tables, trees) when they're readily available than when you have to manually implement them each and every time like in C (for the current GPC) and also in Pascal so far.
BTW, is it the actual Mac OS X interfaces that are so huge, or your wrappers? I remember you had a long list of string functions for string types of various kinds and lengths. Using templates, they'd shrink drastically.
Frank
Frank Heckenbach wrote:
Jonas Maebe wrote:
What's taking so long currently, Adriaan, is probably the GPI imports. That's independent of the target, i.e. a Pascal to C++ converter would have to do it just the same, so its complexity is independent of how the output is structured. As I said, this is a separate issue which would be easier to tackle if the compiler was written in a high-level language (such as C++ or Pascal with templates). It's easier to find and experiment with efficient data structures (e.g., hash tables, trees) when they're readily available than when you have to manually implement them each and every time like in C (for the current GPC) and also in Pascal so far.
I agree that it is a separate issue and that it is easier to tackle in a future compiler. So, I am moving this to a new thread. We need not discuss it any further now, but I am still following up to answer your questions.
The problem is not so much the speed of GPI loading as such, but the fact that unit-recompilation is of order-2 (in GPC) instead of order-1 (as in FPC).
Imagine a program P that uses unit1 .. unitN, where each unit K uses unit1..unitK-1. Currently, a compile of program P with GPC
- triggers a compilation of unit1 and writes a unit1.gpi
- triggers a compilation of unit2, which uses unit1, so loads unit1.gpi and writes unit2.gpi
- triggers a compilation of unit3, which uses unit1 and unit2, so loads unit1.gpi and unit2.gpi and writes unit3.gpi
- etcetera
- triggers a compilation of unitK, which uses unit1..unitK-1, so loads unit1.gpi..unitK-1.gpi and writes unitK.gpi
- etcetera
So, unit1 is written once and loaded N-1 times, unit2 is written once and loaded N-2 times, etcetera, unitK is written once and loaded N-K times, etcetera. Therefore, N .gpi files are written and (N-1) + (N-2) + ... + (N-N) = N * (N-1)/2 .gpi files are loaded.
In other words, compilation times increase quadratically with the number of used units. Improving .gpi load times doesn't help much (only by the square root of the load-time improvement) and the process will still be slow when N is large. The only real solution is to make compilation a linear process, where already-loaded .gpi files are not loaded again. My understanding is that this is difficult to accomplish with the current GPC back-end.
The same problem exists with C/C++ header files and the common solution there is to use include guards http://en.wikipedia.org/wiki/Include_guard.
BTW, is it the actual Mac OS X interfaces that are so huge, or your
It is the actual Mac OS X interfaces that are so huge,
wrappers? I remember you had a long list of string functions for string types of various kinds and lengths. Using templates, they'd shrink drastically.
That was the other problem, not related to the above issue. We are using operator overloading to mimic UCSD-Pascal strings. But, unfortunately, that triggers another quadratic performance issue in the compiler: http://www2.gnu-pascal.de/crystal/gpc/en/mail12897.html.
Regards,
Adriaan van Os
Adriaan van Os wrote:
Frank Heckenbach wrote:
Jonas Maebe wrote:
What's taking so long currently, Adriaan, is probably the GPI imports. That's independent of the target, i.e. a Pascal to C++ converter would have to do it just the same, so its complexity is independent of how the output is structured. As I said, this is a separate issue which would be easier to tackle if the compiler was written in a high-level language (such as C++ or Pascal with templates). It's easier to find and experiment with efficient data structures (e.g., hash tables, trees) when they're readily available than when you have to manually implement them each and every time like in C (for the current GPC) and also in Pascal so far.
I agree that it is a separate issue and that it is easier to tackle in a future compiler. So, I am moving this to a new thread. We need not discuss it any further now, but I am still following up to answer your questions.
The problem is not so much the speed of GPI loading as such, but the fact that unit-recompilation is of order-2 (in GPC) instead of order-1 (as in FPC).
Imagine a program P that uses unit1 .. unitN, where each unit K uses unit1..unitK-1. Currently, a compile of program P with GPC
- triggers a compilation of unit1 and writes a unit1.gpi
- triggers a compilation of unit2, which uses unit1, so loads unit1.gpi and writes unit2.gpi
- triggers a compilation of unit3, which uses unit1 and unit2, so loads unit1.gpi and unit2.gpi and
writes unit3.gpi
- etcetera
- triggers a compilation of unitK, which uses unit1..unitK-1, so loads unit1.gpi..unitK-1.gpi and
writes unitK.gpi
- etcetera
So, unit1 is written once and loaded N-1 times, unit2 is written once and loaded N-2 times, etcetera, unitK is written once and loaded N-K times, etcetera. Therefore, N .gpi files are written and (N-1) + (N-2) + ... + (N-N) = N * (N-1)/2 .gpi files are loaded.
Ah, I see. I thought the issue was that a unit import/export with N declarations took O(N^2) time. I think we fixed that (or at least in some cases).
In other words, compilation times increase quadratically with the number of used units. Improving .gpi load times doesn't help much (only by the square root of the load-time improvement) and the process will still be slow when N is large. The only real solution is to make compilation a linear process, where already-loaded .gpi files are not loaded again. My understanding is that this is difficult to accomplish with the current GPC back-end.
It's not about the backend, but the driver/compiler structure (gp (or "gpc --automake") calls gpc, and gpc calls gpc1), and it's not difficult, but impossible in the current structure, because each unit is compiled by a separate process which has to load all used interfaces.
The same problem exists with C/C++ header files and the common solution there is to use include guards http://en.wikipedia.org/wiki/Include_guard.
No, that's not the same problem. The problem there is that the compilation of a single source file would be O(N^2) if it includes N headers, and each header in turn includes the previous ones. (And it would lead to duplicate declarations.)
This problem would *also* occur in Pascal (so the whole compilation would be O(N^3)), if GPC (like probably all other Pascal compilers) didn't prevent it by keeping a list of imported interfaces and avoid duplicate imports.
BTW, is it the actual Mac OS X interfaces that are so huge, or your
It is the actual Mac OS X interfaces that are so huge,
So if I understand you correctly now, the OS interfaces span so many units. Now I understand why you suggested to output them all to a single C++ file (since they probably change rarely, only with upgrades, and then all at once).
Can't you just put them all in a single unit, or would it become so large that you couldn't compile it (for memory reasons or so)?
Another idea (which, if possible, works now, and might also be necessary for a future GPC, if any), is to reduce the number of dependencies (maybe requiring some restructuring). I find it hard to believe that really every unit requires all previous ones. Can't they be grouped in "topics" that are mostly independent of each other (so the "uses" graph would look more like a tree than a linear chain; e.g. a roughly balanced binary tree would be O(N log N), which should be acceptable)?
Frank
On 31 Jul 2010, at 06:53, Frank Heckenbach wrote:
So if I understand you correctly now, the OS interfaces span so many units. Now I understand why you suggested to output them all to a single C++ file (since they probably change rarely, only with upgrades, and then all at once).
Can't you just put them all in a single unit, or would it become so large that you couldn't compile it (for memory reasons or so)?
Yes, that is possible and there is in fact such a unit (MacOSAll). Some people however prefer to only import what they need rather than everything at once.
Another idea (which, if possible, works now, and might also be necessary for a future GPC, if any), is to reduce the number of dependencies (maybe requiring some restructuring). I find it hard to believe that really every unit requires all previous ones.
The units already only contain the dependencies they need. Restructuring them is not really an option, because they are straight translations of C headers (so that would make updating them to newer versions much more difficult).
Jonas
Hi,
On 7/29/10, Adriaan van Os gpc@microbizz.nl wrote:
It depends where the compiler is spending its time, generating code or processing declarations, reading unit symbols, etcetera. For GPC on Mac OS X, the bottleneck is absolutely the latter.
(Caveat: limited testing) For DJGPP + GPC, the linker seems to be the weakest link. I've heard similar horror stories from Marcov from FPC, and I know they're glad to have their own (Win32-only, for now). Of course, that could just be the COFF backend; ELF may fare better (presumably since all devs use Linux these days). GNU Binutils' "new and improved" Gold linker (from Ian Lance Taylor of Google) is, last I checked, still only for Linux/ELF/x86 (but written in C++, go figure).
Frank Heckenbach wrote:
- producing LLVM assembly http://llvm.org/docs/LangRef.html
That might be an option. However, the problems I see are (a) I'm not familiar with LLVM assembly (in contrast to C++), and I'm not sure anyone else here is, so it would take additional learning before one could get productive, and (b) it's, of course, low-level, so we'd have to reimplement things like the object-model, templates, exceptions etc., that C++ already has.
LLVM assembly is easy to learn (easier than C++). With regard to implementing object models, GPC has several object models and most of them are, as far as I know, not compatible with C++. We don't want to implement for example the TP object model with C++ objects, do we ? Or are you hinting specifically at the new envisioned C++-compatible object model ?
I find LLVM assembly quite attractive, generic and well designed, distant from a moving API. I see a great future for LLVM. GCC looks more like an unstructured pile of macros. If a software project has a large bug database and successive releases only fix "serious regressions for primary targets", then something with that software project is fundamentally wrong (especially if the developers are hostile and aggressive on the subject on their mailing list).
Regards,
Adriaan van Os
Adriaan van Os wrote:
Frank Heckenbach wrote:
- producing LLVM assembly http://llvm.org/docs/LangRef.html
That might be an option. However, the problems I see are (a) I'm not familiar with LLVM assembly (in contrast to C++), and I'm not sure anyone else here is, so it would take additional learning before one could get productive, and (b) it's, of course, low-level, so we'd have to reimplement things like the object-model, templates, exceptions etc., that C++ already has.
LLVM assembly is easy to learn (easier than C++). With regard to implementing object models, GPC has several object models and most of them are, as far as I know, not compatible with C++. We don't want to implement for example the TP object model with C++ objects, do we ?
I would want to (if we go the converter to C++ route). GPC objects are not (and have never been) binary compatible with BP, and feature-wise, AFAICS, the C++ model is a strict superset of the BP model (and all other relevant models), with the probably necessary mapping of BP con-/destructors to plain methods, as I wrote.
I find LLVM assembly quite attractive, generic and well designed, distant from a moving API. I see a great future for LLVM. GCC looks more like an unstructured pile of macros. If a software project has a large bug database and successive releases only fix "serious regressions for primary targets", then something with that software project is fundamentally wrong (especially if the developers are hostile and aggressive on the subject on their mailing list).
I won't defend the GCC development model (I've been annoyed myself often enough), but I have to say that GCC is considerably more than a pile of macros. The interface could be described as such (to a large extent), but internally there's quite a bit more going on, e.g. the various optimizers.
Frank
In message 1280182603.16859.279143@goedel.fjf.gnu.de, Frank Heckenbach ih8mj@fjf.gnu.de writes
So in order to tell whether continuing GPC development is worthwhile, I'd like to know who of you would actually care about major new features in GPC (as opposed to just preserving the status quo), and who would be interested not only in using GPC, but also supporting its continued development, either by actively contributing to it, or -- perhaps in the case of companies that use GPC -- by paying for continued development.
We are a small company that uses GPC in our main application. We don't have a pressing requirement for new features but we would be prepared to consider making a financial contribution towards the cost of continuing development.
Hi,
We have an extensive code base of Extended Pascal that is in active use commercially, and we will continue development for many years to come. We are using the Prospero compiler, which is still serving our needs. But, as Prospero is Windows-only, gpc has always appeared as the most promising escape route to portability, should we need it. Therefore I am hoping that gpc will find its way into the future.
I have a small contribution to the discussion. Regarding the suggestion of turning gpc into a translator to some other language, I would suggest considering the D programming language as the target language, instead of C++. I have been following the development of D for many years, and have always seen similarities between Extended Pascal and D. Off the top of my head there are nested functions, modules, function argument storage classes, better type safety than C++, dynamic arrays with size information, and fast compilation (especially compared to C++). There may be more similarities. D's template design is also much better than C++'s, as is its approach to const-correctness and its alternative to multiple inheritance. D is designed to be a better language than C++, and I think it is. I think that of modern languages D is most compatible with the way Pascal programmers like to think (pardon the generalization), and although its syntax is C++-inspired, D might be a good candidate to "take over" from Pascal. If gpc would do a good job at generating readable D code, Pascal programmers could choose to continue writing Pascal or make the switch to D completely and be happy with it.
Some pointers: http://en.wikipedia.org/wiki/D_%28programming_language%29 http://www.digitalmars.com/d/
Good luck, Bastiaan Veelo.
On 29.07.2010 01:13, Bastiaan Veelo wrote:
I would suggest considering the D programming language as the target language, instead of C++.
In this case, as we talk about target languages, I would like to add an idea.
There could be something written like:
pascal -> target language
or
pascal -> model language (ML), and then plugins (plugouts :-) of the form:
ML -> D
ML -> C++
ML -> Haskell
ML -> Python
and so on.
But as you know, "someone" has to write all of this. If we have 2 languages, and neither of them is that widespread, an L1 -> L2 converter won't be written or used at all.
Another idea is to write extensions to {FPC, p2c}, or to fork {FPC, p2c}, which may reduce development time.
Eike
On 10-07-26 06:16 PM, Frank Heckenbach wrote:
Hi everybody,
I don't really know how many of you currently use GPC, and to what extent and in which ways, e.g., do you use it just to maintain some legacy code, or are you actively writing new applications?
<snip>
Dear Mr Heckenbach,
I spent 3 years converting our legacy Pascal applications from SUN Pascal to GNU Pascal under Sparc Solaris. Many big programs, in fact, with a mixture of Pascal, C, C++, ESQL, 4GL, you name it... The conversion was an adventure, mostly because the SUN Pascal compiler had digested bugs in our code for years without detecting them. By using GPC, we found approximately 350 bugs at compile time and more than 450 runtime errors that had never been seen before. The result is that we produced the most reliable release ever of our programs.
So, maybe what is under the hood of GPC is not as clean as it could be (as you mention in your article), but GPC is the best Pascal compiler I have ever used, and your support for it has been five-star quality.
The conversion project finished a year ago and I must say it has been a great success - and a lifesaver for our company, since Sun stopped support for its Pascal compiler in 2000.
Then our biggest customer asked: "What about running your applications under Linux?" Since GPC is multiplatform, the adventure restarted but, this time, it took only 2 months to have all our applications running under Linux. And the biggest part of the job was in the C/C++ code!
I cannot see any Pascal compiler other than GPC that would have permitted this.
By the way, we still develop in Pascal - though partially - so we may be interested in new features. The choice of the new GPC to be written in Pascal or to be a C++ translator does not matter for us as long as it stays as strong and portable as it is now - and stays compatible with the current one.
Again many thanks for this great job, Frank.
Pascal Viandier
Pascal Viandier wrote:
I spent 3 years converting our legacy Pascal applications from SUN Pascal to GNU Pascal under Sparc Solaris. Many big programs, in fact, with a mixture of Pascal, C, C++, ESQL, 4GL, you name it... The conversion was an adventure, mostly because the SUN Pascal compiler had digested bugs in our code for years without detecting them. By using GPC, we found approximately 350 bugs at compile time and more than 450 runtime errors that had never been seen before. The result is that we produced the most reliable release ever of our programs.
So, maybe what is under the hood of GPC is not as clean as it could be (as you mention in your article), but GPC is the best Pascal compiler I have ever used, and your support for it has been five-star quality.
Thanks. Indeed, the "internal" and "external" quality in this regard are almost unrelated, so I'm not surprised by your observations (though the numbers are quite impressive :-).
Compile-time error checking is almost entirely a front-end issue and relatively straightforward, without the backend getting in the way, apart from its data structures, i.e. if you want to check e.g. "this type must be an integer type", you have to know how to express this in TREE_NODEs (which is among the basic things one has to learn when working on GCC), but otherwise you just do your checks and emit your errors.
For runtime errors, of course, you need to emit code, i.e. through the backend, but usually this code is relatively easy (e.g. for range-checking, 1 or 2 comparisons and a call to an error routine -- which is implemented in the RTS in Pascal, not produced by the compiler). The more difficult part is figuring out where to implement which checks, which again is almost entirely a frontend issue.
So it's no surprise that GPC has become quite good at catching errors (just as adding simple features has become easier than it was 10 years ago). But at the same time, it has run into a (possible) dead end with backend integration, and implementing larger features (e.g., a new object model, exceptions, templates) in the current GPC is more difficult than it would be with a cleanly written compiler.
Frank
Interesting discussion, no doubt.
Frank's original question: should GNU Pascal live/die?
This evolved into compiler writing, other languages, OS, etc.
Let me state my interest: I am committed to programming in Pascal on Windows. I am not interested in C, D, E, LLVM ...
I write large scientific programs that do things.
I am not interested in programming to test the programming environment, which many of the discussants here seem most interested in.
All programming languages move on and develop; I do not understand the interest in going back to standards of long ago.
The only good suggestion I have seen in this discussion is to merge GNU Pascal and Free Pascal, to avoid the duplication of effort now going on. I do not think the result will ever be competitive with Delphi (which hardly anyone here mentions).
As to O/S, the following statistics should interest everyone:
OS Platform Statistics
Windows XP is the most popular operating system. The Windows family counts for almost 90%:
2010      Win7   Vista  Win2003  WinXP  W2000  Linux  Mac
June      19.8%  11.7%  1.3%     54.6%  0.4%   4.8%   6.8%
May       18.9%  12.4%  1.3%     55.3%  0.4%   4.5%   6.7%
April     16.7%  13.2%  1.3%     56.1%  0.5%   4.5%   7.1%
March     14.7%  13.7%  1.4%     57.8%  0.5%   4.5%   6.9%
February  13.0%  14.4%  1.4%     58.4%  0.6%   4.6%   7.1%
January   11.3%  15.4%  1.4%     59.4%  0.6%   4.6%   6.8%
Source: http://www.w3schools.com/browsers/browsers_os.asp
You would think from the amount of ink flowing here about Linux that it must be 10 times as popular as it actually is. Note that Windows consistently scores about 88-89%.
Delphi however does support Linux.
Turbo Pascal evolved into BP, then into Delphi.
Like it or not, Delphi is the best Pascal system going; pricey indeed, but well worth it in my opinion. Its IDE is remarkably good, its compiler is blindingly fast. Did you know that Delphi checks your source code as fast as you write it, and instantly flags syntax errors?
I find it important to write for Windows without actually knowing Windows; Delphi is excellent for that. I have found it really hard to write Windows programs in GNU Pascal -- it has been so long since I looked at Free Pascal/Lazarus that I am not sure if it is even possible there.
Delphi is not perfect; that is partly why it is updated almost annually. For instance, its treatment of operator overloading is poor compared to that of GNU Pascal (or Free Pascal, if I remember right). It has expanded the Exit command from function bodies to take a parameter: the function result. This is useful -- more so than returning to the GOTO wars of yesteryear.
HF
PS
For the sake of clarity in this discussion, could someone please make a glossary of the many acronyms therein. I do not recognize half of them.
=======================================================
Frank Heckenbach wrote:
Hi everybody,
since GPC development has mostly stalled, I've thought about if and how its development could be continued. Here are my current thoughts about it. Since the text is quite long, I put it on the web: http://fjf.gnu.de/gpc-future.html
As I write there, I don't see much future in the current way of developing GPC, but also alternative development models will not be a task for a single person. In other words, without some new contributors, there is probably no chance for GPC development to continue.
I don't really know how many of you currently use GPC, and to what extent and in which ways, e.g., do you use it just to maintain some legacy code, or are you actively writing new applications?
So in order to tell whether continuing GPC development is worthwhile, I'd like to know who of you would actually care about major new features in GPC (as opposed to just preserving the status quo), and who would be interested not only in using GPC, but also supporting its continued development, either by actively contributing to it, or -- perhaps in the case of companies that use GPC -- by paying for continued development.
If it turns out there is no sufficient amount of interest, I'm afraid to say it's probably better to put an end to it now rather than further dragging along. (Of course, the existing GPC versions will continue to be available, and anyone who wants to can use and modify them, which the GPL already guarantees, but without prospects for the future, I would then retire from GPC development and start to rewrite my own Pascal programs in other languages.)
Frank
Dear Harley,
The Windows family counts for almost 90%:
Not in academia, it doesn't. Here at Brandeis University, there are more students with Apple laptops than Windows laptops. There are more and more students using Linux in preference to Windows. Yesterday I wiped another Windows drive and my student installed Ubuntu. We're sick of Windows.
Only when you get into the vast realm of corporate America do you come across 90% Windows, and those people don't use the kind of software we're writing. They are still using XP because they are stuck with it from all their custom-made software packages from ten years ago.
I suspect that your 90% statistic does not apply to the GPC programmer's customers. In the long run, it could be that Windows, being a costly operating system, is going to die. Indeed, it is my belief that it will. Everywhere in the Physics community, Linux and MacOS are taking over.
Yours, Kevan
I have to agree with this. At our research institute all three OSes (Mac, Linux, Windows) are represented with Mac laptops scoring the highest among students. Because of the mix, platform independence is important to us and when "in house" programs are written we strictly separate "do things" from "present things on screen". So, free multiplatform languages like GPC, FPC and of course GNU C(++) and Fortran are way more in use than for instance Delphi, which is tied to Windows.
Cheers, Gorazd
----- Original Message ----
From: Kevan Hashemi hashemi@brandeis.edu To: Prof. Harley Flanders harley@umich.edu Cc: Frank Heckenbach ih8mj@fjf.gnu.de; gpc@gnu.de Sent: Fri, July 30, 2010 1:19:33 PM Subject: Re: Quo vadis, GPC?
Dear Harley,
The Windows family counts for almost 90%:
Not in academia, it doesn't. Here at Brandeis University, there are more students with Apple laptops than Windows laptops. There are more and more students using Linux in preference to Windows. Yesterday I wiped another Windows drive and my student installed Ubuntu. We're sick of Windows.
Only when you get into the vast realm of corporate America do you come across 90% Windows, and those people don't use the kind of software we're writing. They are still using XP because they are stuck with it from all their custom-made software packages from ten years ago.
I suspect that your 90% statistic does not apply to the GPC programmer's customers. In the long run, it could be that Windows, being a costly operating system, is going to die. Indeed, it is my belief that it will. Everywhere in the Physics community, Linux and MacOS are taking over.
Yours, Kevan
-- Kevan Hashemi, Electrical Engineer Physics Department, Brandeis University http://alignment.hep.brandeis.edu/
The Windows family counts for almost 90%:
Not in academia, it doesn't.
Right - that's the missing 10%, mostly made up of academia and the entertainment industry.
Walk outside into the real world (which you eventually will, like it or not), however, and UNIX/MacOS/Linux systems virtually disappear.
40 years of UNIX, 20 years of Linux, and 10 years of MacOSX have failed to produce a combined market share greater than 9%, which has been flatlined for over a decade.
Only 1 out of 10 people want/need to use it.
On Fri, 30 Jul 2010, Hodges, Robert CTR USAF AFMC 520 SMXS/MXDEC wrote:
The Windows family counts for almost 90%:
Not in academia, it doesn't.
Right - that's the missing 10%, mostly made up of academia and the entertainment industry.
Walk outside into the real world (which you eventually will, like it or not), however, and UNIX/MacOS/Linux systems virtually disappear.
40 years of UNIX, 20 years of Linux, and 10 years of MacOSX have failed to produce a combined market share greater than 9%, which has been flatlined for over a decade.
Only 1 out of 10 people want/need to use it.
But that doesn't mean that those 9% of computer users shouldn't be accommodated, particularly by their fellows. I would also argue that deliberately reinforcing that 90% market share (it used to be higher) is not in the interest of anybody but Microsoft and its "partners". I have no interest in turning this discussion into a debate over MS, but the computer industry suffers not from too much competition, but from too little.
Understand that while the vast majority of computer users use Windows, many others prefer other systems and are willing to sacrifice a considerable amount of convenience to do so. This is a feature, not a bug.
--------------------------| John L. Ries | Salford Systems | Phone: (619)543-8880 x107 | or (435)867-8885 | --------------------------|
On Fri, 30 Jul 2010, Prof. Harley Flanders wrote:
OS PLATFORM STATISTICS
Windows XP is the most popular operating system. The Windows family counts for almost 90%:
You would think from the amount of ink flowing here about Linux that it must be 10 times as popular as it actually is. Note that Windows consistently scores about 88-89%.
That would be the nature of this list. GNU Pascal is part of the GNU project, which has always been UNIX oriented. One would also expect to find in any GNU related project a large number of people who are philosophically committed to free software, which will tend to bias people in favor of Linux, which is the single most important free (FSF definition) operating system. Also, the GNU project has done a very good job, over the years, of writing highly portable software, which is of benefit to all computer users, regardless of the platform (architecture+OS) on which they work (even, IMHO, Windows). GPC has done a very good job of maintaining platform independence, which is a very good thing that I hope will continue. I particularly appreciate this aspect of GPC, as I routinely develop for multiple platforms and have for nearly all of my professional career.
I should note that while I don't agree with RMS and his followers that free software is a moral imperative, I greatly respect those who believe that strongly enough to act on it by developing high quality software available to anyone who wants to use it, even though they know they'll never get rich doing it. At the very least, they should be thanked for opening up a software market that had become largely noncompetitive.
--------------------------| John L. Ries | Salford Systems | Phone: (619)543-8880 x107 | or (435)867-8885 | --------------------------|
Well spoken.
John L. Ries wrote:
That would be the nature of this list. GNU Pascal is part of the GNU project, which has always been UNIX oriented. One would also expect to find in any GNU related project a large number of people who are philosophically committed to free software, which will tend to bias people in favor of Linux, which is the single most important free (FSF definition) operating system. Also, the GNU project has done a very good job, over the years, of writing highly portable software, which is of benefit to all computer users, regardless of the platform (architecture+OS) on which they work (even, IMHO, Windows). GPC has done a very good job of maintaining platform independence, which is a very good thing that I hope will continue. I particularly appreciate this aspect of GPC, as I routinely develop for multiple platforms and have for nearly all of my professional career.
I should note that while I don't agree with RMS and his followers that free software is a moral imperative, I greatly respect those who believe that strongly enough to act on it by developing high quality software available to anyone who wants to use it, even though they know they'll never get rich doing it. At the very least, they should be thanked for opening up a software market that had become largely noncompetitive.
Well spoken.
Regards,
Adriaan van Os
Prof. Harley Flanders wrote:
As to O/S, the following statistics should interest everyone:
OS Platform Statistics
Windows XP is the most popular operating system. The Windows family counts for almost 90%:
2010      Win7   Vista  Win2003  WinXP  W2000  Linux  Mac
June      19.8%  11.7%  1.3%     54.6%  0.4%   4.8%   6.8%
May       18.9%  12.4%  1.3%     55.3%  0.4%   4.5%   6.7%
April     16.7%  13.2%  1.3%     56.1%  0.5%   4.5%   7.1%
March     14.7%  13.7%  1.4%     57.8%  0.5%   4.5%   6.9%
February  13.0%  14.4%  1.4%     58.4%  0.6%   4.6%   7.1%
January   11.3%  15.4%  1.4%     59.4%  0.6%   4.6%   6.8%
Except that browsers are not exactly the same thing as programming languages.
You would think from the amount of ink flowing here about Linux that it must be 10 times as popular as it actually is. Note that Windows consistently scores about 88-89%.
I hope you don't mean this as a complaint. If there are so many Windows programmers lurking, they're certainly allowed to speak up.
As far as I know, of the 5 main developers GPC has had, the percentage of Windows users (as their main OS) is exactly 0%. So if your statistic were relevant, there would be dozens of Windows programmers just waiting to take over as main developers, and GPC's future would be secured for decades. Or maybe a lot of those 90% are just computer "consumers" who wouldn't notice if the OS, hardware and applications on their computer were exchanged completely, as long as their icons look the same -- and you wonder why they're not here to discuss the future of a Pascal compiler?
John L. Ries wrote:
Also, the GNU project has done a very good job, over the years, of writing highly portable software, which is of benefit to all computer users, regardless of the platform (architecture+OS) on which they work (even, IMHO, Windows). GPC has done a very good job of maintaining platform independence, which is a very good thing that I hope will continue. I particularly appreciate this aspect, of GPC, as I routinely develop for multiple platforms and have for nearly all of my professional career.
I'd add that it's not only (or even mainly) the GNU project, but standards such as POSIX that foster portability. Linux supports POSIX (plus extensions), Windows also (grudgingly) supports POSIX (minus some flaws), so a port from Linux to Windows is not always trivial, but possible with some effort.
OTOH, native Windows programs use a completely different API, and a port from Windows to Linux is a much larger task, basically rewriting anything related to the OS (not only GUIs, also file I/O uses a completely different API). Most Windows programmers don't care about this, or at best only as an afterthought.
Prof. Harley Flanders wrote:
Delphi however does support Linux.
Does it? AFAIK, there was a short-lived attempt named Kylix which has long been abandoned.
Like it or not, Delphi is the best Pascal system going out there, pricey indeed; well worth it in my opinion.
Fortunately, opinions can differ. In my opinion, software that is non-free and non-portable is never "the best".
Its IDE is remarkably good, its compiler is blindingly fast. Do you know that Delphi is checking your source code as fast as you write it, and instantly flags syntax errors?
If you like Delphi, good for you, then you should have no problems switching if GPC dies.
Side note: On-the-fly syntax checking and similar features are things that I do consider useful and would like to implement in my IDE (PENG). Of course, I need time to do it, time which I've so far spent more on GPC ...
Delphi [...] has expanded the Exit command from function bodies to have a parameter: the function result.
BTW, GPC supports that as "return".
For the sake of clarity in this discussion, could someone please make a glossary of the many acronyms therein. I do not recognize half of them.
Otherwise, it might help if you told us which acronyms in particular you'd like expanded.
Hodges, Robert CTR USAF AFMC 520 SMXS/MXDEC wrote:
The Windows family counts for almost 90%:
Not in academia, it doesn't.
Right - that's the missing 10%, mostly made up of acedamia and the entertainment industry.
Walk outside into the real world (which you eventually will, like it or not), however, and UNIX/MacOS/Linux systems virtually disappear.
Strange. Since I'm neither in academia nor in the entertainment industry, and neither are Peter and other people I work with, we must be virtually nonexistent. I wonder who supported and extended GPC between 1996 (Jukka, Jan-Jaap) and 2002 (Waldek).
Only 1 out of 10 people want/need to use it.
Or market lock-in is at work ...
John L. Ries wrote:
Understand that while the vast majority of computer users use Windows, many others prefer other systems and are willing to sacrifice a considerable amount of convenience to do so. This is a feature, not a bug.
*Sacrifice* convenience? The times I've had to sacrifice convenience were when I was "forced" to use a Windows system (fortunately not often). Even genuine GUI features like copy & paste (supposedly Windows' strength; I don't even have to talk about traditional Unix strengths such as file systems and networking) work so much better and more easily in Linux GUIs like KDE than under Windows. I'm sure many of those "90%" would agree if they ever made an unbiased comparison (but of course, MS tries to make sure this never happens).
Prof A Olowofoyeku (The African Chief) wrote:
This discussion is in danger of drifting off topic (the future of GPC development) to religious wars (OSes). So can we please bring it back to the topic?
Probably not, but I changed the subject line. ;-)
The most crucial point for me is portability and cross-platform development. Right now, I can write a program, and compile it for Windows, Linux, and my embedded system (accessed via smbfs). If I need to, I can compile the same program for Solaris Sparc and Dos as well (and have done so in the past).
I did the same with Linux, Solaris, Dos and occasionally Windows.
This is the reason for my earlier question about which C++ standard the code generated by the renewed GPC would target.
In this case, the question is not so pressing, since g++ is at least as portable as gpc (using the same backend and better supported), and any C++ converter output should, of course, at least be compilable with g++.
John L. Ries wrote:
I should note that while I don't agree with RMS and his followers that free software is a moral imperative, I greatly respect those who believe that strongly enough to act on it by developing high quality software available to anyone who wants to use it, even though they know they'll never get rich doing it. At the very least, they should be thanked for opening up a software market that had become largely noncompetitive.
This statement was explicitly acked by two other persons (it almost seems to be the only thing in this thread that more than two people have agreed on so far). If the general opinion here is that the project is a take-away, or that free software is some kind of charity, that doesn't bode well for future projects either.
This applies especially to Windows users. So far, the only major GPC contributor who is a Windows user has been The Chief, who builds Windows binaries and, together with me, implemented special Windows support in some runtime units and the RTS. That's right -- I actually helped him support Windows-only features. Conversely, I've yet to see a Windows user (who is not also a Linux user) supporting Linux-only features. And then I have to read (see above) how Windows is underrepresented and not talked about enough.
So for those who think free software means they just get it without paying (free beer, not free speech) and yet get to tell us what we should do and complain when we do what we consider important, not what they consider important (instead of doing it themselves), that won't work. Sure, one of the points of free software is that anyone can use it for any purpose, but there's also a rule: Those who code make the decisions. And by this measure, if I look through the mailing list archives, Dos and Windows have actually been vastly overrepresented.
One reason why successful free software projects thrive is contributions by users. This is true for big projects such as GNU and Linux (of the current source code in the GNU project and in Linux, RMS and Linus, respectively, have written only a small part) or TeX (though TeX itself is mostly Knuth's original code, what made it most useful was LaTeX and many 3rd party packages), but also for smaller projects, in particular also FPC.
With GPC this hasn't happened much. Even if working on the GPC compiler itself was difficult due to its backend dependency, this was no reason to prevent contributions to Pascal units, both the units that come with GPC, including most of its runtime system, and 3rd party units or applications (e.g. IDEs). But this happened only to a small degree. I can only speculate on why, whether it's mentality (as expressed in statements such as the above), or that everyone is just too busy working on their in-house applications, or even the large array of dialects that GPC supports (which are normally thought of as an asset, but may have become a problem in that they stifle cooperation between users if everybody uses their own set of features).
Frank
On 31.07.2010 09:12, Frank Heckenbach wrote:
One reason why successful free software projects thrive is contributions by users. This is true for big projects such as GNU and Linux (of the current source code in the GNU project and in Linux, RMS and Linus, respectively, have written only a small part) or TeX (though TeX itself is mostly Knuth's original code, what made it most useful was LaTeX and many 3rd party packages), but also for smaller projects, in particular also FPC.
With GPC this hasn't happened much. Even if working on the GPC compiler itself was difficult due to its backend dependency, this was no reason to prevent contributions to Pascal units, both the units that come with GPC, including most of its runtime system, and 3rd party units or applications (e.g. IDEs). But this happened only to a small degree. I can only speculate on why, whether it's mentality (as expressed in statements such as the above), or that everyone is just too busy working on their in-house applications, or even the large array of dialects that GPC supports (which are normally thought of as an asset, but may have become a problem in that they stifle cooperation between users if everybody uses their own set of features).
I think you miss one important point here: to contribute to an OSS project, one must be rather idealistic or even religious about it (yes, this includes me ;)). But in general those people are not only religious about the license of the software but also about other areas of IT. So it's quite natural that someone who works on an OSS compiler also considers the language used to write the compiler important. This applies even to the GUI: just look at MSEGui or fpGUI (http://opensoft.homeip.net/fpgui/): people are writing complete GUI libraries in FPC (this goes beyond Lazarus, which only wraps GTK/win32/Qt).
On Sat, 31 Jul 2010 06:22:00 pm Florian Klämpfl wrote:
I think you miss one important point here: to contribute to an OSS project one must be rather idealistic or even religious about it (yes, this includes me ;)).
"Must"?
That's probably why Microsoft agreed to match Jeff Atwood's donation of $5000 to support OSS Dot-Net development. Microsoft is famous for being idealistic and religious about open source.
http://www.codinghorror.com/blog/2007/06/supporting-open-source-projects-in-...
Not to mention idealistic and religious companies like Fujitsu, HP, Skype, to mention just a few...
http://www.postgresql.org/about/sponsors
Am 31.07.2010 19:16, schrieb Steven D'Aprano:
On Sat, 31 Jul 2010 06:22:00 pm Florian Klämpfl wrote:
I think you miss one important point here: to contribute to an OSS project one must be rather idealistic or even religious about it (yes, this includes me ;)).
"Must"?
"One" in the sense of a person, obviously, and "contribute" in the sense of coding; otherwise, why would one code for free? How much time have you spent writing OSS so far?
That's probably why Microsoft agreed to match Jeff Atwood's donation of $5000 to support OSS Dot-Net development. Microsoft is famous for being idealistic and religious about open source.
And? $10,000 is nothing. That is approximately what a senior software developer/architect with a PhD/Master's/Dipl.-Inf./Dipl.-Ing. costs a company per month.
http://www.codinghorror.com/blog/2007/06/supporting-open-source-projects-in-...
Not to mention idealistic and religious companies like Fujitsu, HP, Skype, to mention just a few...
On Sun, 1 Aug 2010 03:33:15 am Florian Klämpfl wrote:
Am 31.07.2010 19:16, schrieb Steven D'Aprano:
On Sat, 31 Jul 2010 06:22:00 pm Florian Klämpfl wrote:
I think you miss one important point here: to contribute to an OSS project one must be rather idealistic or even religious about it (yes, this includes me ;)).
"Must"?
"One" in the sense of a person, obviously, and "contribute" in the sense of coding; otherwise, why would one code for free?
Who says "one" is coding for free?
Companies like Google, Apple, Red Hat and many others, pay programmers to work on OSS, not because of a sense of idealism but because it suits their business model.
That's probably why Microsoft agreed to match Jeff Atwood's donation of $5000 to support OSS Dot-Net development. Microsoft is famous for being idealistic and religious about open source.
And? $10,000 is nothing. That is approximately what a senior software developer/architect with a PhD/Master's/Dipl.-Inf./Dipl.-Ing. costs a company per month.
Yes, and? The point which seems to have escaped you is that *Microsoft*, a company that a few years ago described Linux and OSS as "cancer", has now realised the business worth of it to *themselves*. It might only be a tiny contribution so far, but Microsoft aren't doing it for idealistic or religious reasons. They didn't become the world's biggest IT company by writing out $10,000 cheques for nothing.
And I noticed that you just skipped over the companies funding Postgresql development:
Not to mention idealistic and religious companies like Fujitsu, HP, Skype, to mention just a few...
So please, stop making out that OSS is solely some sort of idealistic crusade. People contribute to OSS for all sorts of reasons, including business sense, legal requirements, and others.
Am 01.08.2010 23:07, schrieb Steven D'Aprano:
On Sun, 1 Aug 2010 03:33:15 am Florian Klämpfl wrote:
Am 31.07.2010 19:16, schrieb Steven D'Aprano:
On Sat, 31 Jul 2010 06:22:00 pm Florian Klämpfl wrote:
I think you miss one important point here: to contribute to an OSS project one must be rather idealistic or even religious about it (yes, this includes me ;)).
"Must"?
"One" in the sense of a person, obviously, and "contribute" in the sense of coding; otherwise, why would one code for free?
Who says "one" is coding for free?
You still didn't answer how much you contributed to OSS so far. How much did you? How much did you earn with it so far? Is it enough to pay your bills? If not, it's idealistic.
I have been developing OSS for longer than the term "OSS" has existed (actually since 1993), and in my experience 99% of all development is done for idealistic reasons; I have met a lot of people from a lot of different projects. Even if people earn money with it, an idealistic component is required due to the poor payment.
Companies like Google, Apple, Red Hat and many others, pay programmers to work on OSS, not because of a sense of idealism but because it suits their business model.
These are a few big projects. Just ask the GPC maintainers how many man-months of funding they have received so far.
And I noticed that you just skipped over the companies funding Postgresql development:
Not to mention idealistic and religious companies like Fujitsu, HP, Skype, to mention just a few...
So please, stop making out that OSS is solely some sort of idealistic crusade.
See above.
People contribute to OSS for all sorts of reasons, including business sense, legal requirements, and others.
This affects only a few big projects and a few people.
On 02 Aug 2010, at 09:01, Florian Klaempfl wrote:
I have been developing OSS for longer than the term "OSS" has existed (actually since 1993), and in my experience 99% of all development is done for idealistic reasons; I have met a lot of people from a lot of different projects. Even if people earn money with it, an idealistic component is required due to the poor payment.
Or because it's a hobby (like writing, painting, sports, theatre, playing computer games, ...)
Jonas
Am 02.08.2010 10:47, schrieb Jonas Maebe:
On 02 Aug 2010, at 09:01, Florian Klaempfl wrote:
I have been developing OSS for longer than the term "OSS" has existed (actually since 1993), and in my experience 99% of all development is done for idealistic reasons; I have met a lot of people from a lot of different projects. Even if people earn money with it, an idealistic component is required due to the poor payment.
Or because it's a hobby (like writing, painting, sports, theatre, playing computer games, ...)
Of course, but imo a hobby is also something idealistic.
Steven D'Aprano wrote:
That's probably why Microsoft agreed to match Jeff Atwood's donation of $5000 to support OSS Dot-Net development. Microsoft is famous for being idealistic and religious about open source.
And? $10,000 is nothing. That is approximately what a senior software developer/architect with a PhD/Master's/Dipl.-Inf./Dipl.-Ing. costs a company per month.
Yes, and? The point which seems to have escaped you is that *Microsoft*, a company that a few years ago described Linux and OSS as "cancer", has now realised the business worth of it to *themselves*.
Hold on! *Dot-Net* development isn't exactly supporting Linux! In fact one might suspect that the purpose of the support might be to lure some free software supporters away from free OSs (or stop them from switching to them).
While I agree with your general point, this seems to be a bad example.
Frank
On Mon, 2 Aug 2010 10:10:30 pm Frank Heckenbach wrote:
Steven D'Aprano wrote:
That's probably why Microsoft agreed to match Jeff Atwood's donation of $5000 to support OSS Dot-Net development. Microsoft is famous for being idealistic and religious about open source.
And? $10,000 is nothing. That is approximately what a senior software developer/architect with a PhD/Master's/Dipl.-Inf./Dipl.-Ing. costs a company per month.
Yes, and? The point which seems to have escaped you is that *Microsoft*, a company that a few years ago described Linux and OSS as "cancer", has now realised the business worth of it to *themselves*.
Hold on! *Dot-Net* development isn't exactly supporting Linux! In fact one might suspect that the purpose of the support might be to lure some free software supporters away from free OSs (or stop them from switching to them).
While I agree with your general point, this seems to be a bad example.
I didn't mean to imply that Microsoft was all lovey-dovey with Linux now (although they've been forced by customer demand to be slightly less hostile to it). All I meant was that Microsoft sees value in the OSS model as it applies to .Net. If that's because they think it will lure people away from Linux to Windows, or stop them switching away from Windows, that just goes to support our point that there are lots of reasons for supporting OSS other than idealism.
On 03 Aug 2010, at 02:30, Steven D'Aprano wrote:
I didn't mean to imply that Microsoft was all lovey-dovey with Linux now (although they've been forced by customer demand to be slightly less hostile to it). All I meant was that Microsoft sees value in the OSS model as it applies to .Net. If that's because they think it will lure people away from Linux to Windows, or stop them switching away from Windows, that just goes to support our point that there are lots of reasons for supporting OSS other than idealism.
It's not really supporting in this case. Microsoft is of the opinion that a lot of open source software infringes on their software patents (which is definitely true, just as it is for a lot of closed source software and for every other big company's patent portfolio). They believe that they should be compensated for this. With the .NET platform, they managed to get Novell to pay for a patent license for including Mono (an OSS implementation of the .NET platform) in their Linux distribution (SUSE), thereby validating Microsoft's claim that companies who distribute that software should pay them, and in general creating a precedent for paying patent license fees in return for the privilege of developing/distributing open source software.
So Microsoft is not really contributing to open source, but rather trying to get everyone who works on open source to pay them for infringing on their patents. It's a bit like saying that BP contributes to protecting the environment because they pay lobbyists for talking to politicians who work on environmental laws. Sure, they'll support some token "good points", but there's a significant difference between that and actually supporting the overall goal.
There are of course better examples, such as IBM, Google and even Apple. In those cases, it's more that the companies consider it more cost-effective to collaborate on basic infrastructure-level software than to each privately develop basically the same stuff, not that they want everyone in the world to pay them for the privilege of writing and distributing free software.
As Florian mentioned, however, the developers paid that way make up only a small fraction of total open source development (many of them are even regular employees of those companies who used to work on similar, in-house versions of such software), and unless your project and skills happen to be crucial to a particular company's business model, there is very little chance of ever making a living out of it.
Jonas
On Tue, 3 Aug 2010 07:07:10 pm Jonas Maebe wrote:
It's not really supporting in this case. Microsoft is of the opinion that a lot of open source software infringes on their software patents (which is definitely true
[...]
Unless you can point out the infringing software, and demonstrate what patents it infringes, that's just a supposition. It is not "definitely true" except by doing violence to the idea of "definitely".
So Microsoft is not really contributing to open source, but rather trying to get everyone who works on open source to pay them for infringing on their patents.
In what way does Microsoft writing out a cheque for thousands of dollars to give to developers who develop open sourced .Net software "trying to get everyone who works on open source to pay them for infringing on their patents"?
I mean, yes, I think Microsoft is devious too, but thinking that them funding developers to write OSS is a ploy to force them to pay patent fees is pure tinfoil helmet territory.
The simpler explanation is far more likely. Microsoft has reluctantly come to the conclusion that they can't easily destroy the FOSS community, that they have to co-exist with Linux in the same way that they co-exist with Apple (that is, occasional border skirmishes rather than open war), and they'd rather have people writing FOSS for Windows and .Net than for Linux.
Yes, they're threatening to wield the patent sword (and yet they haven't done so ... you have to wonder if their patents were so sound, why they haven't sued Red Hat out of existence yet?). They can do both at the same time. MS is a huge company, with many different departments with their own budgets to spend and their own managers making their own decisions. Not everything comes from the personal desk of Steve Ballmer.
On 03 Aug 2010, at 17:18, Steven D'Aprano wrote:
On Tue, 3 Aug 2010 07:07:10 pm Jonas Maebe wrote:
It's not really supporting in this case. Microsoft is of the opinion that a lot of open source software infringes on their software patents (which is definitely true
[...]
Unless you can point out the infringing software, and demonstrate what patents it infringes, that's just a supposition. It is not "definitely true" except by doing violence to the idea of "definitely".
(to view the patents, enter the patent numbers at http://patft.uspto.gov/netahtml/PTO/srchnum.htm and click on "Search"; to see the most basic stuff covered by a patent, read claims that do not refer to other claims)
* Microsoft's patent on page up/page down (including a formula on how to calculate how much you have to scroll a text document to show the next page): 7,415,666
* Microsoft sues TomTom over (V)FAT patents infringed by the Linux kernel: http://laforge.gnumonks.org/weblog/2009/02/27/
* Microsoft reaches patent deal with HTC over patents infringed by Android: http://www.pcmag.com/article2/0,2817,2363175,00.asp
* SUSE disables sub-pixel anti-aliasing because it infringes on Microsoft patents: http://techrights.org/2007/04/08/patent-font/
In what way does Microsoft writing out a cheque for thousands of dollars to give to developers who develop open sourced .Net software "trying to get everyone who works on open source to pay them for infringing on their patents"?
As I explained in the part you cut away: they are trying to get the open source version of .NET (Mono) gain wide acceptance. And they also reached a patent license deal with Novell that covered Mono (and OpenOffice): see the last FAQ at http://www.novell.com/linux/microsoft/faq.html, implying that everyone who wants to distribute/use those products should also get a patent license from Microsoft (otherwise, why would Novell need one?)
I mean, yes, I think Microsoft is devious too, but thinking that them funding developers to write OSS is a ploy to force them to pay patent fees is pure tinfoil helmet territory.
Tinfoil hats need not apply. Steve Ballmer publicly announced that sooner or later someone will have to pay for the Microsoft patents that FOSS projects infringe. See http://www.linux-watch.com/news/NS6670466370.html about the Linux kernel in particular, and http://www.linux-watch.com/news/NS3513440381.html about the Novell deal and veiled threats to users of Red Hat Linux.
Yes, they're threatening to wield the patent sword (and yet they haven't done so ...
They have done plenty of times, see above. And those are only a number of published cases (there are more, involving at least Amazon, Samsung and IO Data). As you may or may not know, most patent cases however never reach the daylight and are settled behind the scenes (it's generally not good business to publish that you are being accused of patent infringement, and patent court cases cost insane amounts of money -- in fact, if the license fee is less than $1 million, in the US the costs of the court case itself will always outweigh the license fee; see slide 9 of http://people.ffii.org/~jmaebe/conf0411/tue/Brian%20Kahin.pdf ).
you have to wonder if their patents were so sound, why they haven't sued Red Hat out of existence yet?).
* Red Hat doesn't have that much money. It's more profitable to sue Linux users (Amazon, Samsung, TomTom, ...)
* It could attract unwanted attention from anti-trust authorities (again)
They can do both at the same time. MS is a huge company, with many different departments with their own budgets to spend and their own managers making their own decisions.
Sure, see for example http://news.cnet.com/8301-13505_3-10290686-16.html
.NET is however not a good example.
Jonas
On Wed, 4 Aug 2010 06:02:15 am Jonas Maebe wrote:
On 03 Aug 2010, at 17:18, Steven D'Aprano wrote:
On Tue, 3 Aug 2010 07:07:10 pm Jonas Maebe wrote:
It's not really supporting in this case. Microsoft is of the opinion that a lot of open source software infringes on their software patents (which is definitely true
[...]
Unless you can point out the infringing software, and demonstrate what patents it infringes, that's just a supposition. It is not "definitely true" except by doing violence to the idea of "definitely".
(to view the patents, enter the patent numbers at http://patft.uspto.gov/netahtml/PTO/srchnum.htm and click on "Search"; to see the most basic stuff covered by a patent, read claims that do not refer to other claims)
- Microsoft's patent on page up/page down (including a formula on how to calculate how much you have to scroll a text document to show the next page): 7,415,666
[snip further examples]
Fair enough. However, until they have been proven in court, it still isn't proof that they definitely are infringing. You have Microsoft's accusation, and (e.g.) HTC's decision to make a deal rather than fight a legal battle that would probably cost them millions even if they win. That's good evidence, but not proof.
In fairness, in hindsight I'd probably agree with you that *some* OSS *somewhere* surely has to be infringing some patent. Whether that patent is actually valid and would itself survive a court challenge is another question. I suspect that's at least partially why MS prefers to talk tough about patents and cross-licence rather than sue.
[...]
Tinfoil hats need not apply. Steve Ballmer publicly announced that sooner or later someone will have to pay for the Microsoft patents that FOSS projects infringe.
He can say whatever he likes, but until infringement is either proven in court or admitted (and paying a licence fee or doing a cross-licence patent deal is not admitting infringement), it's just a wild claim.
See http://www.linux-watch.com/news/NS6670466370.html about the Linux kernel in particular, and http://www.linux-watch.com/news/NS3513440381.html about the Novell deal and veiled threats to users of Red Hat Linux.
Yes, they're threatening to wield the patent sword (and yet they haven't done so ...
They have done plenty of times, see above. And those are only a number of published cases (there are more, involving at least Amazon, Samsung and IO Data). As you may or may not know, most patent cases however never reach the daylight and are settled behind the scenes (it's generally not good business to publish that you are being accused of patent infringement, and patent court cases cost insane amounts of money -- in fact, if the license fee is less than $1 million, in the US the costs of the court case itself will always outweigh the license fee; see slide 9 of http://people.ffii.org/~jmaebe/conf0411/tue/Brian%20Kahin.pdf ).
Right, which is why licence deals don't prove that infringement occurs. Patents make a wonderful instrument of intimidation and legal blackmail: "pay us $1M, or your lawyer $10M".
Steven D'Aprano wrote:
He can say whatever he likes, but until infringement is either proven in court or admitted
I'm with you on that count.
When I considered patenting my own ideas, I discovered that you could patent just about anything that had not been patented before. It's only when it comes to enforcing the patent in court that you run up against several other restrictions.
One such restriction is that your patent can't just reserve the use of existing technology for some purpose you thought up, like "I patent the idea of selling software over the internet" (US Patent 4,528,643, issued July 9, 1985, to Freeny, bought by E-Data Corp, who tried to make money out of it; I don't know what happened to that in the end, but I seem to remember it going nowhere). Your patent must specify some process or technology that clearly took some effort to figure out. You can't simply rush to be the first person to patent something obvious. Furthermore, if it turns out that somebody else was already doing what you suggested, then your patent is invalid.
For sure: the rules are vague, and come down to a judgment call in court.
Yours, Kevan
On 04 Aug 2010, at 16:06, Kevan Hashemi wrote:
Steven D'Aprano wrote:
He can say whatever he likes, but until infringement is either proven in court or admitted
I'm with you on that count.
Patent validity and infringement are not objective measures. They are subjective assessments (just like copyright infringement in many cases, for that matter), and often depend on which judge (and in the US: which jury) you have. That's the reason why so many patent infringement lawsuits are filed in the District Court for the Eastern District of Texas, because it turned out to be generally partial to patent holders.
Combined with the fact that patent lawsuits cost insane amounts of money, whether or not you actually even infringe a patent (let alone whether or not it is valid) doesn't even matter in most cases. Sun Microsystems found that out first hand from IBM in the early 80s: http://www.forbes.com/asap/2002/0624/044.html
This means that generic patent threats (such as those made by Ballmer) are actually much worse than "project A infringes patents X, Y and Z" statements, because in the latter case you can at least attempt to work around them. They cause FUD (fear, uncertainty and doubt), just like the patent deal between Microsoft and Novell.
Therefore I stand by my original points that
a) Microsoft paying a Mono developer $5,000 in no way demonstrates their alleged acceptance of open source or free software. That action fits perfectly into their patent FUD game (open source software whose usage you can license from Microsoft is good, and they'll even support its development)
b) most software probably infringes on at least a couple of software patents out there (ones that will be held valid by at least one judge/jury somewhere if they were tried; besides, as long as a regular patent is not explicitly invalidated, it is assumed to be valid by virtue of having passed the examination process)
Your patent must specify some process or technology that clearly took some effort to figure out. You can't simply jump to be the first person to patent something obvious.
Non-obviousness (and in general: patentability) is entirely unrelated to effort. Non-obviousness also doesn't mean the same thing it does in common conversation. In general, many things that are new in some (small) way qualify, because of the reasoning that
a) if it were that obvious, then it would have been published or patented already
b) now that you see it described, it may seem obvious, but hindsight is 20/20; you have to consider whether or not it was obvious before you had read it
Especially because of b), obviousness is considered in a quite narrow way by patent examiners (and judges and juries) when assessing patent validity. There are some methods that may help (see e.g. http://en.wikipedia.org/wiki/Inventive_step_and_non-obviousness) , but in general the non-obviousness barrier is quite low. As the UK's Chartered Institute of Patent Attorneys nicely puts it (http://www.cipa.org.uk/pages/advice-patents ):
"So, unless the feature that makes your invention “new”, compared with what has gone before, is utterly trivial, then it is usually best for you to assume that it does involve an inventive step."
("inventive step" is the European equivalent of the US "non-obviousness" requirement in patent law)
Jonas
On Thu, 5 Aug 2010 12:55:20 am Jonas Maebe wrote:
Therefore I stand by my original points that
a) Microsoft paying a Mono developer $5,000 in no way demonstrates their alleged acceptance of open source or free software. That action fits perfectly into their patent FUD game (open source software whose usage you can license from Microsoft is good, and they'll even support its development)
There's a slight contradiction there. First you say it doesn't demonstrate their acceptance of FOSS, then you say it demonstrates their promotion of FOSS. In the second case, *Microsoft* FOSS, but still FOSS.
I think we're in violent agreement about this. In the context where this discussion started, namely motivations for contributing to FOSS software, it was claimed that idealism and religious fervour are the only reasons. I pointed out that many non-idealist and non-religious companies contribute to FOSS software. You've pointed out that Microsoft's support of FOSS is *extremely* nuanced and, dare I say it, Machiavellian:
"Microsoft FOSS Good, our competitors' FOSS Bad"
one might almost say. That's hardly support for FOSS on idealistic grounds, but a hard-nosed business tactic to defend their revenue stream from a threat. It displays an attitude that FOSS is not "a cancer" that needs to be destroyed, or a fringe movement of some hairy unwashed ex-UNIX dinosaurs and idealists that can be ignored, but something that will be around in the long term, and a real competitor to Microsoft that needs to be managed.
*That* was my point all along. I never suggested that Microsoft was a willing contributor to, say, the Linux kernel.
Oh wait...
http://www.microsoft.com/presspass/features/2009/jul09/07-20linuxqa.mspx http://www.linux-mag.com/id/7439
As I said, there are all sorts of reasons people contribute to FOSS projects, starting with "I have an itch that needs scratching", to "it will help prevent paying customers from dropping Windows Hyper-V virtualisation servers in favour of something else".
b) most software probably infringes on at least a couple of software patents out there (ones that will be held valid by at least one judge/jury somewhere if they were tried; besides, as long as a regular patent is not explicitly invalidated, it is assumed to be valid by virtue of having passed the examination process)
The second thing any competent patent defence would do (after trying to prove that your device is nothing like the patented device) is to try to invalidate the patent. Demonstrating that your device is based on prior art is not just a defence of the infringement claim, but could very well demonstrate that the patent isn't even valid because it does nothing new or different from the prior art.
As for the examination process... even the United States Patent Office is aware that the process isn't working and they grant patents too readily. They recently ran a trial "Peer-to-Patent" process, which invited people to research prior art for them. According to them, the average time each examiner has to research a patent is 18-20 hours:
http://www.peertopatent.org/getting_started
I suspect that's a self-serving over-estimate from the patent office. I remember reading that the more accurate figure is 5-6 hours per patent application, but I can't find the reference to it now so you'll just have to take that with a rather large grain of salt.
Am 05.08.2010 01:11, schrieb Steven D'Aprano:
On Thu, 5 Aug 2010 12:55:20 am Jonas Maebe wrote:
Therefore I stand by my original points that
a) Microsoft paying a Mono developer $5,000 in no way demonstrates their alleged acceptance of open source or free software. That action fits perfectly into their patent FUD game (open source software whose usage you can license from Microsoft is good, and they'll even support its development)
There's a slight contradiction there. First you say it doesn't demonstrate their acceptance of FOSS, then you say it demonstrates their promotion of FOSS. In the second case, *Microsoft* FOSS, but still FOSS.
I think we're in violent agreement about this. In the context where this discussion started, namely motivations for contributing to FOSS software, it was claimed that idealism and religious fervour are the only reasons.
I wrote:
I think you miss one important point here: to contribute to an OSS project one must be rather idealistic or even religious about it (yes, this includes me ;))
I didn't say "only" but pointed out that it's a major motivation for individual persons ("one"), and this is what I have experienced in >15 years of free software and OSS development. It also explains perfectly why FPC is a living project (yes, we are always looking for contributors too, but which OSS project doesn't?) while GPC is currently in some zombie state.
Anyway, I hope GPC continues in some way, so that I don't also have to implement Extended Pascal support in FPC on top of the ISO 7185 support (ISO-style goto not yet committed to SVN, but almost finished) that I added in the last few days (just to be prepared :)).
Florian Klaempfl wrote:
I wrote:
I think you miss one important point here: to contribute to an OSS project one must be rather idealistic or even religious about it (yes, this includes me ;))
I didn't say "only" but pointed out that it's a major motivation for individual persons ("one"), and this is what I have experienced in >15 years of free software and OSS development.
Since I'm currently in a parsing mindset, I have to point out that "must" in English implies a necessary condition, even without adding "only" (actually, "one must only ..." would mean something quite different).
Furthermore, "one" *can* be understood to refer to any person or group, not only individual persons (like in German: "man"), and I also understood it this way.
Frank
On Thu, 5 Aug 2010 05:04:05 pm Florian Klaempfl wrote:
I wrote:
I think you miss one important point here: to contribute to an OSS project one must be rather idealistic or even religious about it (yes, this includes me ;))
I didn't say "only" but pointed out that it's a major motivation for individual persons
You didn't say "major", you said "must". Not "usually are", or "can be", or "often are", or "sometimes are", or any other nuanced qualification. You made a sweeping generalisation that doesn't stand up to scrutiny.
I don't know if you are a native English speaker, it's possible that I have been fooled by your otherwise excellent English, and you actually are unaware that "must" implies certitude. To be a square, a geometric figure must have four equal sides and four equal angles. If that's the case, then we've both learned something.
("one"), and this is what I have experienced in >15 years of free software and OSS development. It also explains perfectly why FPC is a living project (yes, we are always looking for contributors too, but which OSS project doesn't?) while GPC is currently in some zombie state.
How does that explain this? Both GPC and FPC are FOSS software, aren't they? People who are idealistic about Pascal can be equally idealistic when using GPC as when using FPC, can they not?
If FPC is thriving while GPC is not, that probably reflects the fact that the overall Pascal community has shrunk to the point that there are not enough users to support two FOSS compilers (plus however many non-FOSS). The fact that Free Pascal has "won" (if it actually is a fact) is probably more to do with the fact that typing "free pascal" into Google is a more obvious search strategy than typing "gpc".
Am 06.08.2010 02:59, schrieb Steven D'Aprano:
You didn't say "major", you said "must". Not "usually are", or "can be", or "often are", or "sometimes are", or any other nuanced qualification. You made a sweeping generalisation that doesn't stand up to scrutiny.
I don't know if you are a native English speaker,
No.
("one"), and this is what I have experienced in >15 years of free software and OSS development. It also explains perfectly why FPC is a living project (yes, we are always looking for contributors too, but which OSS project doesn't?) while GPC is currently in some zombie state.
How does that explain this? Both GPC and FPC are FOSS software, aren't they? People who are idealistic about Pascal can be equally idealistic when using GPC as when using FPC, can they not?
The whole thread is about contributors, not users. FPC is self-hosting, so someone who is idealistic about Pascal can also work on the compiler (and yes, several people working on FPC can maybe read C but have never coded seriously in C, not to mention C++).
Anyways, the thread gets off topic.
On Sat, 31 Jul 2010, Florian Klämpfl wrote:
On 31.07.2010 09:12, Frank Heckenbach wrote:
One reason why successful free software projects thrive is contributions by users. This is true for big projects such as GNU and Linux (of the current source code in the GNU project and in Linux, RMS and Linus, respectively, have written only a small part) or TeX (though TeX itself is mostly Knuth's original code, what made it most useful was LaTeX and many 3rd party packages), but also for smaller projects, in particular also FPC.
With GPC this hasn't happened much. Even if working on the GPC compiler itself was difficult due to its backend dependency, this was no reason to prevent contributions to Pascal units, both the units that come with GPC, including most of its runtime system, and 3rd party units or applications (e.g. IDEs). But this happened only to a small degree. I can only speculate on why, whether it's mentality (as expressed in statements such as the above), or that everyone is just too busy working on their in-house applications, or even the large array of dialects that GPC supports (which are normally thought of as an asset, but may have become a problem in that they stifle cooperation between users if everybody uses their own set of features).
I think you miss one important point here: to contribute to an OSS project, one must be rather idealistic or even religious about it (yes, this includes me ;)). But in general those people are not only religious about the license of the software but also in other areas of IT. So it's quite natural that someone who works on an OSS compiler also considers the language used to write the compiler important. This applies even to the GUI: just look at MSEGui or fpGUI (http://opensoft.homeip.net/fpgui/): people are writing complete GUI libraries in FPC (this goes beyond Lazarus, which only wraps GTK/win32/qt).
I think there are a variety of reasons why people contribute to free software, including the failure of the market to provide software that people want at prices they are willing to pay. If, for example, you want a Pascal compiler for your favorite computing platform and the major suspects don't think they can make a profit providing it, you might well think it worth your while to devote 2-4 hours a week developing or improving a free one, rather than have to switch languages or platforms (of course, you know this). I daresay the availability of free software has been a large part of the reason why OS/2 (as ECS) still survives a decade after IBM stopped supporting it. Idealism has inspired a lot of free software, but it's not the only motivation for contributing.
--------------------------
John L. Ries
Salford Systems
Phone: (619)543-8880 x107
or (435)867-8885
--------------------------
John L. Ries wrote:
I think there are a variety of reasons why people contribute to free software, including the failure of the market to provide software that people want at prices they are willing to pay.
... or which works the way they want. That's a main reason for me. Whenever I use some proprietary software, including embedded software on "consumer electronics" devices, I notice various minor issues where I think: a 5-line (or less) change in the source code and this or that would work so much better or less annoying. But without source available, the simple change becomes impossible.
With free software, I just make the change and usually submit it back, if only so I don't have to make it again in the next version if it's accepted, so actually for quite egoistic reasons. (Though, incidentally or not, I find such kinds of problems more rarely in free software -- probably because someone else found and fixed them before me; while with non-free software, thousands or millions can find the same problem and whine on the internet about them, but they can't fix them, even if they'd like to take a little time to do it.)
(Of course, this doesn't mean there aren't also bigger issues in both free and non-free software -- in particular my involvement with GPC was caused by the rather major issue of BP's 16 bit and Dos limitations. But getting rid of thousands of minor nuisances is very helpful.)
Frank
On Sat, 2010-07-31 at 09:12 +0200, Frank Heckenbach wrote: [...]
Even if working on the GPC compiler itself was difficult due to its backend dependency, this was no reason to prevent contributions to Pascal units, both the units that come with GPC, including most of its runtime system, and 3rd party units or applications (e.g. IDEs). But this happened only to a small degree. I can only speculate on why, whether it's mentality (as expressed in statements such as the above), or that everyone is just too busy working on their in-house applications, or even the large array of dialects that GPC supports (which are normally thought of as an asset, but may have become a problem in that they stifle cooperation between users if everybody uses their own set of features).
Even if people are very busy with their own applications, they must be developing libraries of frequently-used routines, which would be of great benefit to the development community (I'm not talking here about proprietary code written for one's employers - I'm talking about one's own personal development efforts).
It may be that (as Florian Klämpfl said), not everyone is "religious" enough about the ideals of free software to want to contribute their own source code or their own time. Or it may be that it simply doesn't occur to people that the results of their own development efforts may actually be useful to others (or some may not be very confident about their own code (never stopped me! :-))). I would imagine (or at least hope) that it is not a result of selfishness (i.e., takers, not givers). I suspect that most users of GNU software are altruistic enough.
I generally try to make available to others units (with sources) that I have benefited from (either by converting/modifying existing public domain/free sources, or by writing from scratch myself), because I assume (perhaps conceitedly) that, if I needed that code, then there must be someone else out there who would also need it. The problem is that in such cases (especially with units) you never quite know whether anyone is actually using them - which can be a disincentive to continued development.
Conclusion: whether or not the project to renew GPC takes off, all programmers have piles and piles of routines for all sorts of things that would be useful to others and that could save someone weeks (or even months) of development effort. Perhaps we should all try to contribute our source code to a library bank?
Dear Chief,
I generally try to make available to others units (with sources) that I have benefited from
Well, I'm glad to do that, but I have not written to the list advertising my libraries because I figure that would be close to bragging about my code.
So, all are welcome to my utility library, which provides matrix inversion using GPC's dynamic schema types, simplex fitting, and a bunch of other stuff.
http://alignment.hep.brandeis.edu/Software/Sources/utils.pas
The only documentation is in the comments, so there's another problem: when I tell people they can use my code, I have to tell them how to use it also, or else my offer is rather empty.
Yours, Kevan
On Sat, 2010-07-31 at 13:40 -0400, Kevan Hashemi wrote:
Dear Chief,
I generally try to make available to others units (with sources) that I have benefited from
Well, I'm glad to do that, but I have not written to the list advertising my libraries because I figure that would be close to bragging about my code.
I doubt it. It is simply contributing the results of one's efforts to the world. The world may or may not need it, or want it - but it can't make that choice if it doesn't have the offer in the first place! And - one man's "bragging" is another man's life-saver.
So, all are welcome to my utility library, which provides matrix inversion using GPC's dynamic schema types, simplex fitting, and a bunch of other stuff.
http://alignment.hep.brandeis.edu/Software/Sources/utils.pas
[...]
Looks very useful, thanks. My suggestion is (at some point in time) to develop a repository of GPC units. BP used to have the SWAG archives, and there are all sorts of Delphi code repositories.
The only documentation is in the comments, so there's another problem: when I tell people they can use my code, I have to tell them how to use it also, or else my offer is rather empty.
All that is needed is a few example programs that show what could be done with some of the routines - e.g., the "highlights" (and, of course, how to do it). People can have a look at the examples, and then start to modify them to see whether the code can help them. I have personally found example programs (starting with the very basic, to intermediate, to complex) very useful when I have come across source code for units.
Hi,
On 7/31/10, Frank Heckenbach ih8mj@fjf.gnu.de wrote:
As far as I know, of the 5 main developers GPC has had, the percentage of Windows users (as their main OS) is exactly 0%. So if your statistic were relevant, there should be dozens of Windows programmers just waiting to take over as main developers, and GPC's future would be secured for decades. Or maybe a lot of those 90% are just computer "consumers" who wouldn't notice if the OS, hardware and applications on their computer were exchanged completely, as long as their icons look the same -- and you wonder why they're not here to discuss the future of a Pascal compiler?
That's correct, they are just consumers. And anyways 90% of all statistics are made up! ;-) My brother's new laptop is Win64. No problem at all for him (migrating from 32-bit Vista) because his favorite stuff still works: mIRC, iTunes, etc. For me (who shuns all that stuff), it'd be more painful (although admittedly mostly due to bugs and no NTVDM, *sniff*).
I'd add that it's not only (or even mainly) the GNU project, but standards such as POSIX that foster portability. Linux supports POSIX (plus extensions), Windows also (grudgingly) supports POSIX (minus some flaws), so a port from Linux to Windows is not always trivial, but possible with some effort.
Windows doesn't support POSIX at all, unless you meant Cygwin. They long ago dropped the (wimpy) POSIX subsystem. The only reason (AFAICT) to use MinGW is if you don't need POSIX and just want "fast and simple" binaries, or if you dislike the Cygwin .DLL license issue. Obviously it's very hard to port anything from Linux to Windows, so there are many projects that have either no Windows support or only alpha / very incomplete support (e.g. Bash, Go, Git, TCC, QEMU). Targeting POSIX too heavily can indeed hurt on the Windows side. (And I'm not saying Windows is better; it's worse!)
OTOH, native Windows programs use a completely different API, and a port from Windows to Linux is a much larger task, basically rewriting anything related to the OS (not only GUIs, also file I/O uses a completely different API). Most Windows programmers don't care about this, or at best only as an afterthought.
True, sadly. It's something you have to think about ahead of time (or else really know what you're doing, have lots of experience porting, etc).
Like it or not, Delphi is the best Pascal system going out there, pricey indeed; well worth it in my opinion.
Fortunately, opinions can differ. In my opinion, software that is non-free and non-portable is never "the best".
Best Pascal? GPC! :-) Best Win32 (and now or soon Win64?) heavily-tweaked, Pascal-ish GUI-based? Maybe Delphi.
For the sake of clarity in this discussion, could someone please make a glossary of the many acronyms therein. I do not recognize half of them.
I've yet to see a Windows user (who is not also a Linux user) supporting Linux-only features. And then I have to read (see above) how Windows is underrepresented and not talked about enough.
Well, if you're not also a Linux user, how could you even know how to support Linux?? Windows is definitely not completely underrepresented, but GNU does indeed target POSIX (or Linux) almost exclusively, and some GNU projects do indeed reject patches that don't fit that "ideal". So, it's true that Windows is supported, but they are not top tier (except maybe as a fluke sometimes by popularity).
So for those who think free software means they just get it without paying (free beer, not free speech) and yet get to tell us what we should do and complain when we do what we consider important, not what they consider important (instead of doing it themselves), that won't work.
Feedback is important. But sure, the one who codes decides. And programming isn't that easy.
Sure, one of the points of free software is that anyone can use it for any purpose, but there's also a rule: Those who code make the decisions.
:-))
And by this measure, if I look through the mailing list archives, Dos and Windows have actually been vastly overrepresented.
DOS overrepresented??? Nooooooo! Trust me, DOS is heavily shunned. It's not popular at all. I get a lot of flak for my support of it. And that's even with a GPL kernel + "BASE" (FreeDOS) + DJGPP w/ modern GCC. In some ways I consider DOS software more portable (across OSes) and easier to use / install than Win32 or Linux software. Or even easier to build / modify. And yes, I know it's a losing battle ....
Win32 bores me with all its GUI and stupid technology (DirectX, .NET) although I am admittedly mostly a Windows "user". Only very very barely into Linux, but I don't think I'll ever be 100% "pro POSIX, GNU" etc. as long as their build processes and tools are so arcane and complex. In other words, I might use Linux more and more, but "in spirit" I will always be on the lookout to NOT be too *nix-oriented. It's not that I love Windows so much, just that *nix can be ugly at times. Besides, you can indeed ignore Windows (non-POSIX) if you want, but why would you want to?
I don't know, it's a mess. Even if everybody gets along (rare), everything's so complex or nobody has time. Portability is hard.
On Sat, Jul 31, 2010 at 05:37:59AM -0500, Rugxulo wrote:
The only reason (AFAICT) to use MinGW is if you don't need POSIX and just want "fast and simple" binaries or if you dislike the Cygwin .DLL license issue.
No, I use MinGW because it is AFAICT the only compiler to build Windows executables for which you don't have to have a Windows system. ;-)
Well, if you're not also a Linux user, how could you even know how to support Linux??
There is so much information on the Internet...
but GNU does indeed target POSIX (or Linux) almost exclusively, and some GNU projects do indeed reject patches that don't fit that "ideal".
I don't see this. It's quite the opposite. Almost all GNU programs are also available for Windows. So I surely doubt that they reject code that makes their software run under Windows.
Of course they reject code that does not run on a GNU system or gives the (wrong) impression that their system is inferior. But what are you expecting? Would Microsoft support software that doesn't run on their system? That would really be a surprise! In contrast to the GNU project, they don't even try to make most of their software run on other systems. And when they do, it's almost always for "bait and switch" stunts, i.e. get others on the hook, and when they bite and depend on it, drop the support. That was also the only reason for them to implement POSIX support - to switch some POSIX people over to Windows.
barely into Linux, but I don't think I'll ever be 100% "pro POSIX, GNU" etc. as long as their build processes and tools are so arcane and complex.
Have you tried KDevelop or Anjuta?
Hi,
On 7/31/10, Andreas K. Foerster list@akfoerster.de wrote:
On Sat, Jul 31, 2010 at 05:37:59AM -0500, Rugxulo wrote:
The only reason (AFAICT) to use MinGW is if you don't need POSIX and just want "fast and simple" binaries or if you dislike the Cygwin .DLL license issue.
No, I use MinGW because it is AFAICT the only compiler to build Windows executables for which you don't have to have a Windows system. ;-)
I meant vs. Cygwin. In other words, I see a lot of MinGW projects. Since both use GCC, you should "in theory" be able to cross compile from any valid host (even DJGPP). In practice, that doesn't always work (or is hard to do, at least for me).
OpenWatcom comes with libs and bins for (almost) any host or target, so the DOS-hosted version can by default target Win32, OS/2, Linux, etc. (Linux support is still considered experimental, but it seems to work okay, just no shared libs and no 64-bit.) Unlike TCC/Win32 or MinGW, it doesn't need MSVCRT (which I dislike, but anyways...).
Well, if you're not also a Linux user, how could you even know how to support Linux??
There is so much information on the Internet...
I mean how would somebody who doesn't "use" Linux a lot even have the imagination to write Linux-only features?? And a lot of the Internet is inaccurate, outdated, or disappears (bad links), argh!
but GNU does indeed target POSIX (or Linux) almost exclusively, and some GNU projects do indeed reject patches that don't fit that "ideal".
I don't see this. It's quite the opposite. Almost all GNU programs are also available for Windows. So I surely doubt that they reject code that makes their software run under Windows.
Not sure, I don't have a list offhand. I know a lot of Windows ports are unfinished or buggy. Sure, a lot of it works too. But I think GNU intentionally doesn't focus as much on Win32 because it's proprietary. (And yes, a few silly projects do reject patches out of spite.) It depends on the maintainers.
Of course they reject code that does not run on a GNU system or gives the (wrong) impression that their system is inferior. But what are you expecting? Would Microsoft support software that doesn't run on their system? That would really be a surprise! In contrast to the GNU project, they don't even try to make most of their software run on other systems. And when they do, it's almost always for "bait and switch" stunts, i.e. get others on the hook, and when they bite and depend on it, drop the support. That was also the only reason for them to implement POSIX support - to switch some POSIX people over to Windows.
MS is a 90,000+ employee company! Yes, they do a lot of dumb things, don't ask me why. I definitely don't agree with them much. They have so many bugs, dumb decisions, insane ads, high prices, but whatever, it's beyond me. I'm no huge Windows fan, and I don't want to really waste much time focusing on their dumb APIs.
Call me lazy or dumb myself, but I'm somewhat behind in Linux, so it's all fairly Greek to me. A lot of their stuff is too complex, or I just dislike it. But hey, whatever, you can't please everyone. There is no perfect OS.
(I'm on Linux now because XP hosed itself on this P4 oldie a few months ago. I've tried various distros off and on for three years. Always some few peculiar bugs and a few questionable decisions but otherwise okay. I'm no hardcore "user", though, don't need dual monitors or iTunes or Blu-Ray or AutoCad or PhotoShop or ....)
barely into Linux, but I don't think I'll ever be 100% "pro POSIX, GNU" etc. as long as their build processes and tools are so arcane and complex.
Have you tried KDevelop or Anjuta?
No, and sorry, that's not what I meant. ;-) I meant that some things are impossible to build or have horrible dependencies. In other words, I wonder what they were thinking! (This applies to any OS, lots of hard-to-reproduce builds from source code.)
P.S. I don't know, how do you decide what to support and what not to support without pissing someone off? I say the more the merrier, but I know it's hard to do.
On Sat, Jul 31, 2010 at 06:07:41PM -0500, Rugxulo wrote:
The only reason (AFAICT) to use MinGW is if you don't need POSIX and just want "fast and simple" binaries or if you dislike the Cygwin .DLL license issue.
No, I use MinGW because it is AFAICT the only compiler to build Windows executables for which you don't have to have a Windows system. ;-)
I meant vs. Cygwin. In other words, I see a lot of MinGW projects.
I also meant versus Cygwin. I am not aware that Cygwin could be used as a cross compiler. I know no distribution that has a Cygwin cross compiler, but many have a MinGW cross compiler.
Since both use GCC, you should "in theory" be able to cross compile from any valid host (even DJGPP). In practice, that doesn't always work
I do crosscompile my software... with MinGW... in practice.
(or is hard to do, at least for me).
Well, the distributions come with MinGW, but you still often have to install the libraries needed for cross compiling. That can be some hard work. But once it is done, it is done.
Line-oriented programs which do not need special libraries work out of the box.
Well, afaik Fedora comes with a lot of Windows libraries packaged for crosscompiling.
No, and sorry, that's not what I meant. ;-) I meant that some things are impossible to build or have horrible dependencies. In other words, I wonder what they were thinking! (This applies to any OS, lots of hard-to-reproduce builds from source code.)
If you are new to GNU/Linux, stay with the packages that the distribution offers. That's how it's meant to be. Manual installation of software should be the exception.
On Sun, 1 Aug 2010, Andreas K. Foerster wrote:
On Sat, Jul 31, 2010 at 06:07:41PM -0500, Rugxulo wrote:
The only reason (AFAICT) to use MinGW is if you don't need POSIX and just want "fast and simple" binaries or if you dislike the Cygwin .DLL license issue.
No, I use MinGW because it is AFAICT the only compiler to build Windows executables for which you don't have to have a Windows system. ;-)
I meant vs. Cygwin. In other words, I see a lot of MinGW projects.
I also meant versus Cygwin. I am not aware that Cygwin could be used as a cross compiler. I know no distribution that has a Cygwin cross compiler, but many have a MinGW cross compiler.
Cygwin actually comes with an optional set of MinGW libraries. I normally use Cygwin GCC and friends to create MinGW execs since I have a hard time working on Windows without Cygwin anymore (the X server is addictive) and I don't really want to have to maintain a separate MinGW/MSYS installation on my Windows box.
--------------------------
John L. Ries
Salford Systems
Phone: (619)543-8880 x107
or (435)867-8885
--------------------------
Rugxulo wrote:
I've yet to see a Windows user (who is not also a Linux user) supporting Linux-only features. And then I have to read (see above) how Windows is underrepresented and not talked about enough.
Well, if you're not also a Linux user, how could you even know how to support Linux??
By asking Linux users exactly what they want and listening to their complaints when you don't cater to them enough. Sounds strange? Yet in the other direction, this is exactly what happens (see the comments I responded to).
Sure, one of the points of free software is that anyone can use it for any purpose, but there's also a rule: Those who code make the decisions.
:-))
And by this measure, if I look through the mailing list archives, Dos and Windows have actually been vastly overrepresented.
DOS overrepresented??? Nooooooo! Trust me, DOS is heavily shunned.
I said by this measure. Maurice is basically the only active Dos contributor (and he doesn't work on the compiler itself, but on building, packaging and units/libraries).
Besides, you can indeed ignore Windows (non-POSIX) if you want, but why would you want to?
Counter-question: Why should I want to support it? I see 3 possible reasons:
- It comes automatically (e.g., ANSI C functions, POSIX subsystem).
- I have some personal use for it (I don't).
- Someone pays me to do it.
Rugxulo wrote:
No, and sorry, that's not what I meant. ;-) I meant that some things are impossible to build or have horrible dependencies. In other words, I wonder what they were thinking! (This applies to any OS, lots of hard-to-reproduce builds from source code.)
Of course, this applies to any software, whether free or proprietary (though the latter's build problems are obviously not seen by the public; still, on the net you often read about proprietary developers moaning about their incomprehensible and difficult builds).
But with free software you can do something about it. In fact, when I find such problems in other software that I'm going to build more than once, usually one of the first things I do is to simplify the build process, as far as possible -- of course, it's not always possible, e.g. with GPC that depends on GCC we have to integrate into GCC's build system (though that's not so bad), and it's required to get both GPC and GCC sources and patch the latter.
And by submitting those changes I hopefully make building easier for everyone else, e.g. in GPC, quite early during my involvement, I made the GCC patch applied at least semi-automatically (including automatic backend version detection), or for GRX I wrote a configure script to avoid having to edit various files for each build.
Of course, there are many packages, so if you come across build problems (just like any other bugs), do something about it. I don't think the developers make the build hard intentionally; they just choose to spend their time one way or another (i.e., if they worked more on the build process, other features might be neglected). They can't please everyone, and the only ways to get the issues you care about solved are to do it yourself, pay someone to do it, or just hope and wait ...
P.S. I don't know, how do you decide what to support and what not to support without pissing someone off? I say the more the merrier, but I know it's hard to do.
That's what we did in GPC (WRT platforms, dialects and other features). Of course, it makes maintenance more difficult, and it divides the user base which, as we see now, might be a serious problem for continued development, especially if the user base is not so large to begin with.
Prof Abimbola Olowofoyeku (The African Chief) wrote:
Looks very useful, thanks. My suggestion is (at some point in time) to develop a repository of GPC units. BP used to have the SWAG archives, and there are all sorts of Delphi code repositories.
A good idea. However, someone has to do it. We tried it half-heartedly some years ago, but (see above) the time we spent as webmasters was time we couldn't spend on compiler development. So if there are so many who'd like to contribute but are daunted by the compiler internals, as we always hear, why don't they do something like this? (Even though it probably would have been more effective 10 years ago than now ...)
BTW, I hope it would be better than SWAG. I used to go there sometimes back when I used BP, but I found the average code quality quite low, and there was also much duplication. E.g., I remember there were lots of "CRT extension units". Most if not all of them were buggy, incomplete (they did not even support many features of BP's CRT, which they were meant to extend), and their extensions were often rather peculiar (single-purpose, not in the style of similar functions from CRT, or just incomprehensible). So some sort of feedback, and the willingness of authors to listen to it or let others do what's needed, would seem helpful.
Frank
On Sun, Aug 01, 2010 at 11:07:26PM +0200, Frank Heckenbach wrote:
Looks very useful, thanks. My suggestion is (at some point in time) to develop a repository of GPC units. BP used to have the SWAG archives, and there are all sorts of Delphi code repositories.
A good idea. However, someone has to do it. We tried it half-heartedly some years ago, but (see above) the time we spent as webmasters was time we couldn't spend on compiler development. So if there are so many who'd like to contribute but are daunted by the compiler internals, as we always hear, why don't they do something like this? (Even though it probably would have been more effective 10 years ago than now ...)
Why not simply use a wikipage that anyone can edit? I have made a start in the (German) Linux-Wiki: http://linuxwiki.org/GnuPascal/Software
On 02/08/2010, Andreas K. Foerster list@akfoerster.de wrote:
On Sun, Aug 01, 2010 at 11:07:26PM +0200, Frank Heckenbach wrote:
Looks very useful, thanks. My suggestion is (at some point in time) to develop a repository of GPC units. BP used to have the SWAG archives, and there are all sorts of Delphi code repositories.
A good idea. However, someone has to do it. We tried it half-heartedly some years ago, but (see above) the time we spent as webmasters was time we couldn't spend on compiler development. So if there are so many who'd like to contribute but are daunted by the compiler internals, as we always hear, why don't they do something like this? (Even though it probably would have been more effective 10 years ago than now ...)
Why not simply use a wikipage that anyone can edit? I have made a start in the (German) Linux-Wiki: http://linuxwiki.org/GnuPascal/Software
Good idea. I've just added TeX to the collection.
Hi,
On Sat, Jul 31, 2010 at 06:07:41PM -0500, Rugxulo wrote:
But I think GNU intentionally doesn't focus as much on Win32 because it's proprietary.
There seems to be some misconception here on what "GNU" is - there is no "GNU overlord" that tells software developers what to do and what not to do.
Open Source developers do what they need to get their work done, what they find interesting to do, sometimes what they think needs to be done, and possibly, because they are paid for it, some stuff they are not otherwise interested in.
So unless someone waves with a heap of money, most Open Source programmers prefer to work on environments that they are familiar with and that are easy to program for - and that's (most of the time) not windows.
gert
On Sun, Aug 01, 2010 at 12:55:02PM +0200, Gert Doering wrote:
On Sat, Jul 31, 2010 at 06:07:41PM -0500, Rugxulo wrote:
But I think GNU intentionally doesn't focus as much on Win32 because it's proprietary.
There seems to be some misconception here on what "GNU" is - there is no "GNU overlord" that tells software developers what to do and what not to do.
Of course there is a GNU overlord: his name is Richard M. Stallman. He is the only one who can decide whether something is GNU software, or not. And he does tell developers what to do and what not, that's for sure!
For more information see http://www.gnu.org/help/evaluation.html
Open Source developers do what they need to get their work done,
Who talked about "Open Source" here? And what does it have to do with GNU software?
On 02/08/2010, Andreas K. Foerster list@akfoerster.de wrote:
On Sun, Aug 01, 2010 at 12:55:02PM +0200, Gert Doering wrote:
On Sat, Jul 31, 2010 at 06:07:41PM -0500, Rugxulo wrote:
But I think GNU intentionally doesn't focus as much on Win32 because it's proprietary.
There seems to be some misconception here on what "GNU" is - there is no "GNU overlord" that tells software developers what to do and what not to do.
Of course there is a GNU overlord: his name is Richard M. Stallman. He is the only one who can decide whether something is GNU software, or not. And he does tell developers what to do and what not, that's for sure!
[...]
Open Source developers do what they need to get their work done,
Who talked about "Open Source" here? And what does it have to do with GNU software?
Seems to me that there is some underlying confusion here: the GNU Project aims to produce an Open Source (more precisely, a "Free" as in speech) operating system and set of applications, and it has an overlord, namely Richard Stallman. On the other hand, the GNU Project designed the GNU GPL (General Public License) to release the software, to ensure that the OS and friends remain forever "free" (as in speech). Now, a lot of OSS is released under the GPL to ensure "openness" but doesn't belong to the GNU project; such software then has no overlord, save for the individual developer(s). The two are not synonymous, since there are other licenses useful for releasing OSS.
Now, as far as GPC is concerned, I'm not positive about its status: has it been acknowledged as part of the GNU project, or is it only released as OSS, with GCC as its backend compiler?
Cheers,
On Mon, Aug 02, 2010 at 04:47:02PM -0500, Luis Rivera wrote:
Seems to me that there is some confusion here: the GNU Project aims to produce an Open Source (more precisely, a "Free" as in speech) operating system and set of applications, and it has an overlord, namely Richard Stallman. On the other hand, the GNU Project designed the GNU GPL (General Public License) to release the software under, to ensure that the OS and friends remain forever "free" (as in speech). Now, a lot of OSS is released under the GPL to ensure "openness", but doesn't belong to the GNU Project; such software then has no overlord, save for the individual developer(s). The two are not synonymous, since there are other licenses useful for releasing OSS.
That's almost correct. Just one little comment: The GNU project distances itself from the term "Open Source".
Stallman:
| [...] But we want people to know we stand for freedom, so we do not
| accept being mislabeled as open source supporters.
http://www.gnu.org/philosophy/open-source-misses-the-point.html
Well, not all in the Free Software movement agree with Stallman in this. But I see his point.
Apart from that, I still think we talked about GNU software and not generally about Free Software or Open Source. At least I did.
I am so well aware of this distinction, because my software, which is hosted by the FSF, has to carry the "nongnu" moniker... Well, I can live with that.
On Sat, Jul 31, 2010 at 09:12:14AM +0200, Frank Heckenbach wrote:
With GPC this hasn't happened much. Even if working on the GPC compiler itself was difficult due to its backend dependency, this was no reason to prevent contributions to Pascal units, both the units that come with GPC, including most of its runtime system, and 3rd party units or applications (e.g. IDEs). But this happened only to a small degree. I can only speculate on why, whether it's mentality (as expressed in statements such as the above), or that everyone is just too busy working on their in-house applications, or even the large array of dialects that GPC supports (which are normally thought of as an asset, but may have become a problem in that they stifle cooperation between users if everybody uses their own set of features).
Or those units are just not well known. ;-)
So may I take the chance to remind you, that I still offer GPC and FPC bindings for my project AKFAvatar.
AKFAvatar is a graphical frontend for programs. An avatar appears on the screen and presents text in a speech balloon. Writing programs for AKFAvatar is as easy as writing for the command line, but it looks much better.
I have written AKFAvatar mainly because I myself do not like GUIs very much. But many people hate command line tools (they still identify the command line with an ancient operating system that was very limited in its capabilities... you know what I mean.) So I have written AKFAvatar as a compromise between both worlds...
P.S.: I have chosen the name "AKFAvatar" long before I ever heard of the film "Avatar". So it has absolutely nothing to do with that.
P.P.S.: I switched to other programming languages and don't do much with Pascal anymore. Pascal also has the problem that many still identify it with a long outdated implementation, and because of that they think that Pascal is lame. I think Pascal is much better than C, but I wouldn't dare say that anymore in front of people who don't know Pascal well enough.
On Sat, 2010-07-31 at 13:48 +0200, Andreas K. Foerster wrote:
On Sat, Jul 31, 2010 at 09:12:14AM +0200, Frank Heckenbach wrote:
With GPC this hasn't happened much. Even if working on the GPC compiler itself was difficult due to its backend dependency, this was no reason to prevent contributions to Pascal units, both the units that come with GPC, including most of its runtime system, and 3rd party units or applications (e.g. IDEs). But this happened only to a small degree. I can only speculate on why, whether it's mentality (as expressed in statements such as the above), or that everyone is just too busy working on their in-house applications, or even the large array of dialects that GPC supports (which are normally thought of as an asset, but may have become a problem in that they stifle cooperation between users if everybody uses their own set of features).
Or those units are just not well known. ;-)
Indeed :)
So may I take the chance to remind you, that I still offer GPC and FPC bindings for my project AKFAvatar.
AKFAvatar is a graphical frontend for programs. An avatar appears on the screen and presents text in a speech balloon. Writing programs for AKFAvatar is as easy as writing for the command line, but it looks much better.
Looks like it could be useful. Any screen shots to see what it actually looks like in real usage?
On Sat, Jul 31, 2010 at 09:54:49PM +0100, Prof Abimbola Olowofoyeku (The African Chief) wrote:
So may I take the chance to remind you, that I still offer GPC and FPC bindings for my project AKFAvatar.
AKFAvatar is a graphical frontend for programs. An avatar appears on the screen and presents text in a speech balloon. Writing programs for AKFAvatar is as easy as writing for the command line, but it looks much better.
Looks like it could be useful. Any screen shots to see what it actually looks like in real usage?
Only the one on the homepage... (click to enlarge)
Okay, I just said that it looks much better than a text-console, I did not say that it looks great. I know that I'm not a good graphic designer.
There are some other Avatar-images in the package and you can use your own one...
There are also many features, that you could not see in a screenshot anyway. For example you can activate a slowprint mode, in which the text is displayed letter by letter. The Avatar can move in or out. You can show images or play sounds. You can make menus or ask yes/no questions (both usable with the mouse). There is a pager, with which you can scroll through long texts, like "less", but you can also use the mouse-wheel...
Gee, I even implemented a full-blown terminal emulation, in which you can simply run existing programs; and not only line-oriented ones but also ncurses-based programs.... But that is only available for POSIX systems. (Tested on GNU/Linux and FreeBSD.)
On Sat, 31 Jul 2010 05:12:14 pm Frank Heckenbach wrote:
Prof. Harley Flanders wrote:
As to O/S, the following statistics should interest everyone:
OS Platform Statistics
Windows XP is the most popular operating system. The Windows family accounts for almost 90%:
2010      Win7    Vista   Win2003  WinXP   W2000   Linux   Mac
June      19.8%   11.7%   1.3%     54.6%   0.4%    4.8%    6.8%
May       18.9%   12.4%   1.3%     55.3%   0.4%    4.5%    6.7%
April     16.7%   13.2%   1.3%     56.1%   0.5%    4.5%    7.1%
March     14.7%   13.7%   1.4%     57.8%   0.5%    4.5%    6.9%
February  13.0%   14.4%   1.4%     58.4%   0.6%    4.6%    7.1%
January   11.3%   15.4%   1.4%     59.4%   0.6%    4.6%    6.8%
Except that browsers are not exactly the same thing as programming languages.
Exactly.
I work for a Linux consulting company. Apart from the guy who insists on using Lynx as his browser, everyone I know sets their browser user-agent to pretend to be Internet Explorer on Windows XP. I suspect we're probably counted as part of the 54.6%, and I'm pretty sure we aren't the only ones.
But be that as it may, what's important is not the percentage of OS users all up, but the percentage of OS users that are interested in a Pascal compiler.
[...]
Hodges, Robert CTR USAF AFMC 520 SMXS/MXDEC wrote:
Walk outside into the real world (which you eventually will, like it or not), however, and UNIX/MacOS/Linux systems virtually disappear.
Only 1 out of 10 people want/need to use it.
If you think that 1 in 10 is so vanishingly small that it doesn't matter, that it isn't part of the "real world", I invite you to consider 1 in 10 of the people you know disappearing without trace.
Or consider that Pizarro destroyed the Inca Empire with an army less than 0.00125% their size (192 men against 16 million); or that the population of the USA is less than 5% of that of the world. That's the real world.
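The percentages above are easy to verify (the names and figures come straight from the paragraph, not from any external source):

```python
# Check the Pizarro figure quoted above: 192 men against 16 million people.
ratio_percent = 192 / 16_000_000 * 100
print(f"{ratio_percent:.5f}%")  # 0.00120%, indeed less than 0.00125%

# And the USA-vs-world claim, using round 2010-era figures (~310 million
# vs ~6.9 billion -- approximate numbers, assumed for illustration):
usa_share = 310e6 / 6.9e9 * 100
print(f"{usa_share:.1f}%")  # under 5%
```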
Frank Heckenbach wrote:
Even if working on the GPC compiler itself was difficult due to its backend dependency, this was no reason to prevent contributions to Pascal units
I would have offered to contribute, but I assumed that there were so many experts working on it that I would be told No Thank You. But I'm saying now: I'm in. I will look at the GPC internals page and try to see what's entailed in maintaining the interface with GCC, because one way to proceed is simply to find enough people to bring the GCC interface up to date.
Yours, Kevan
Kevan Hashemi wrote:
Frank Heckenbach wrote:
Even if working on the GPC compiler itself was difficult due to its backend dependency, this was no reason to prevent contributions to Pascal units
I would have offered to contribute, but I assumed that there were so many experts working on it that I would be told No Thank You. But I'm saying now: I'm in. I will look at the GPC internals page and try to see what's entailed in maintaining the interface with GCC, because one way to proceed is simply to find enough people to bring the GCC interface up to date.
Doing that is a very difficult and never-ending task, because the back-end will keep changing. It is more time-efficient to go another way, e.g. to rewrite the compiler to produce intermediate C++ or (what I would prefer) intermediate LLVM assembly code.
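To make the suggestion concrete, here is a toy Python sketch of what "emitting C as an intermediate language" means: a code generator that lowers a couple of Pascal-like statements into C source text. This is purely an illustration of the approach (all names here are made up); it is not anything GPC or any real Pascal-to-C translator actually contains.

```python
# Toy sketch: lower a tiny list of Pascal-like statements into C source text.
# A real compiler would work from a full parse tree; this only shows the
# shape of the "translate to C, then let GCC compile it" idea.

def emit_c(statements):
    """Lower a list of (kind, arg) statements into a complete C program."""
    body = []
    for kind, arg in statements:
        if kind == "assign":            # i := 2 + 3   ->   i = 2 + 3;
            name, expr = arg
            body.append(f"    {name} = {expr};")
        elif kind == "writeln":         # writeln('x') ->   puts("x");
            body.append(f'    puts("{arg}");')  # both append a newline
    lines = ["#include <stdio.h>", "", "int main(void) {", "    int i;"]
    lines += body + ["    return 0;", "}"]
    return "\n".join(lines)

prog = [("assign", ("i", "2 + 3")), ("writeln", "Hello from Pascal")]
print(emit_c(prog))
```

Emitting LLVM assembly instead would follow the same pattern, just targeting LLVM IR text (or the LLVM API) rather than C source.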
Regards,
Adriaan van Os
Hi Frank,
On Tue, Jul 27, 2010 at 12:16:43AM +0200, Frank Heckenbach wrote:
I don't really know how many of you currently use GPC, and to what extent and in which ways, e.g., do you use it just to maintain some legacy code, or are you actively writing new applications?
JFTR, we use it to maintain legacy code that was written 30+ years ago but is still in production use. This is dying off, though, as most of the old ASCII-based application is moving to a new Java-based GUI.
So while we're extremely happy that GPC is there, and is working on AIX (while IBM's pascal is no longer supported), we expect that our pascal usage will disappear over the next few years. So for us, "maintaining status quo" is sufficient.
But let me use this opportunity to say a big thank you to you, and to the other GPC developers - it was really good for us to have an option other than IBM, and to change our software at our own pace.
gert
Hi,
On 8/1/10, Gert Doering gd@medat.de wrote:
JFTR, we use it to maintain legacy code that was written 30+ years ago but is still in production use. This is dying off, though, as most of the old ASCII-based application is moving to a new Java-based GUI.
So while we're extremely happy that GPC is there, and is working on AIX (while IBM's pascal is no longer supported), we expect that our pascal usage will disappear over the next few years. So for us, "maintaining status quo" is sufficient.
Have you tried Canterbury Pascal? It outputs Java, but it's not free in any sense. (Vector Pascal runs on Java but outputs code that is compiled natively by GCC.) But neither is as good as GPC, obviously.
I'm not at all familiar with or in love with Java (too bulky ... but a GPL version [IcedTea] does exist, seemingly used heavily by Red Hat), but I do like the idea of a converter between languages.
There's also the possibility of using the Parrot VM (partially used by the Perl project). But I don't know if that's stable enough either.
Excuse my bullet point style reply - v. busy working late at night, etc. Haven't read all the discussion, as there is too much for my limited time. (Sorry.)
- My interest in GPC is that I wrote most of the software of my Ph.D. thesis in Pascal. GPC (along with Metrowerks Pascal), kept it alive.
- I wish I could say that I had programming time to offer, but I really have to keep my time focused on my consulting work. (Mortgages, etc., do that to you...)
- A work-around solution for *my* personal use would be to have a development line against one or two "frozen" OS(es) that can be virtualised easily on the current platforms (e.g. selected versions of Linux + gcc + the other tools needed).
If it's not clear, what I am thinking of is running (say) a version of Linux as a virtual machine on (in my case) Mac OS X and working with GPC from within that. (Ideally the GPC project would host the core VM images.)
Bear in mind that I don't care much which OS does the job as long as my old code runs. I'm also helped by the fact that all my Pascal work was text-only, so I have no graphics concerns. (Although some of it generates PostScript, but that's my problem!)
Grant