[fpc-devel] type discussion

Sebastian Kaliszewski sebaska at melkor.mimuw.edu.pl
Thu Jun 2 17:53:11 CEST 2005


Vinzent Hoefler wrote:
> On Thursday 02 June 2005 13:54, Sebastian Kaliszewski wrote:
> 
> No, they aren't (or let me put it this way: it depends on what you 
> measure). For instance, studies indicate that there are ten times more 
> errors in C code than in Ada code once you've delivered the 
> software.
> 

If you didn't notice, I wrote "C notwithstanding". C is far from modern 
high-level languages (and also from many languages older than it but still 
high-level).


>>The studies show that in high-level languages (C notwithstanding)
>>there is a very evident but simple correlation -- the number of programmer
>>errors per language construct (typically, in non-obfuscated code, it's
>>very close to the number of non-empty, non-comment source lines) is
>>independent of the language.
> 
> 
> You must have read different studies. :) 

Certainly. Those I read were about mainstream software development, not 
niche life-critical realtime systems. The study measured the error rate in 
some typical and rather simple programming tasks coded in various languages 
by various programmers. The correlation found was clear -- the number of 
bugs per 1 kloc was constant and independent of the language. The difference 
was that the same task could be coded in some languages using a 
significantly smaller number of lines.


> A while ago Lockheed Martin 
> switched to SPARK code (which is a statically proven version of Ada) 
> and they claimed this saved them about 80% of development cost compared 
> with former projects of the same size (and we're talking about a flight 
> control system of five million lines of code here).

Two things:
1. They'd better claim they got significant gains, or someone should be 
fired for wasting money. IOW, I take all such corporation-made studies not 
with a grain but with a shovelful of salt.
2. You're talking about a complex life-critical realtime system, not 
mainstream software. In such systems the cost of a bug is many orders of 
magnitude higher than in the mainstream, so the cost structure of such 
projects differs strongly from mainstream programming.


>>C++ (pre-STL) was the worst there; Python, Perl & the like were the best.
>>Java was not that bad, but still about 30% worse than Python (C++ was
>>100%, i.e. 2x worse).
> 
> 
> What did they actually measure? Which languages did they compare? The 
> list above is (except Python) all more or less C-ish or with even worse 
> syntax.

There was a bunch of popular languages (I don't remember them all); the 
C-like ones were C++, Java & Perl. The others certainly were not.


>>The biggest boost to effectiveness was brought by
>>introducing automated memory management (i.e. getting rid of explicit
>>memory freeing).
> 
> 
> Which is something you definitely don't want in large scale realtime 
> systems.

But FPC is useless in such systems as well. Besides the lack of a 
convincing argument for compiler correctness, you need realtime guarantees 
for allocations and other library routines (which FPC does not provide).


>>So even languages with ugly C-ish syntax like "Perl
>>the awful" can beat otherwise elegant & clean languages.
> 
> 
> Of course they can under certain circumstances. Just as a bubble sort 
> can beat a quick sort algorithm if you feed the "right" input.

If you used not the "naive" version of quicksort but something more 
elaborate, you'd have to carefully craft contrived input to make it behave 
badly. But a naive quicksort will perform terribly on really common input.
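To illustrate the point (a minimal sketch of my own, not the code from those 
studies): a quicksort that always takes the first element as its pivot 
degenerates to quadratic behaviour on already-sorted input -- about as 
common as input gets -- no adversarial crafting required. Counting 
comparisons makes the difference visible:

```pascal
program NaiveQSortDemo;
{$mode objfpc}

var
  Comparisons: Int64;

{ Hoare-style quicksort with the naive pivot choice: the first element.
  On sorted input every partition is maximally unbalanced. }
procedure Sort(var A: array of Integer; Lo, Hi: Integer);
var
  P, I, J, T: Integer;
begin
  if Lo >= Hi then
    Exit;
  P := A[Lo];                     { naive pivot: first element }
  I := Lo;
  J := Hi;
  repeat
    while A[I] < P do
    begin
      Inc(Comparisons);
      Inc(I);
    end;
    Inc(Comparisons);             { count the failed test too }
    while A[J] > P do
    begin
      Inc(Comparisons);
      Dec(J);
    end;
    Inc(Comparisons);
    if I <= J then
    begin
      T := A[I]; A[I] := A[J]; A[J] := T;
      Inc(I);
      Dec(J);
    end;
  until I > J;
  Sort(A, Lo, J);
  Sort(A, I, Hi);
end;

var
  A: array[0..999] of Integer;
  K: Integer;
begin
  { already-sorted input: worst case for the naive pivot, ~N*N/2 comparisons }
  for K := 0 to High(A) do
    A[K] := K;
  Comparisons := 0;
  Sort(A, 0, High(A));
  WriteLn('sorted input: ', Comparisons, ' comparisons');

  { random input: close to the expected ~N*log2(N) comparisons }
  Randomize;
  for K := 0 to High(A) do
    A[K] := Random(1000000);
  Comparisons := 0;
  Sort(A, 0, High(A));
  WriteLn('random input: ', Comparisons, ' comparisons');
end.
```

Picking a random pivot or the median of three removes this worst case on 
common input without any change to the algorithm's structure.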

Those examples were real code not some contrived stuff.


>>Hence probably the greatest impact on Object Pascal productivity would
>>come from allowing programmers to declare classes as self-managing
>>(self-freeing, not needing explicit destruction).
> 
> 
> Maybe, yes. But I'm old school. I like to handle memory myself.

Well, I've met some people who needed 'full control', so they wanted to 
code in assembly.
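For what it's worth, FPC already offers one limited form of self-managing 
object: COM-style interfaces are reference counted, so an object held only 
through interface references frees itself when the last reference goes out 
of scope. A minimal sketch (the names are mine, not from any proposal):

```pascal
program AutoFreeDemo;
{$mode objfpc}{$H+}

type
  { In objfpc mode, 'interface' defaults to the COM style,
    which is reference counted. }
  INamed = interface
    function Name: string;
  end;

  { TInterfacedObject supplies _AddRef/_Release, so instances
    are destroyed automatically when the refcount hits zero. }
  TNamed = class(TInterfacedObject, INamed)
  private
    FName: string;
  public
    constructor Create(const AName: string);
    destructor Destroy; override;
    function Name: string;
  end;

constructor TNamed.Create(const AName: string);
begin
  inherited Create;
  FName := AName;
end;

destructor TNamed.Destroy;
begin
  WriteLn('destroying ', FName);  { shows the automatic cleanup }
  inherited Destroy;
end;

function TNamed.Name: string;
begin
  Result := FName;
end;

procedure UseIt;
var
  N: INamed;
begin
  N := TNamed.Create('example');  { no explicit Free anywhere }
  WriteLn('using ', N.Name);
end;  { N goes out of scope here -> refcount 0 -> Destroy runs }

begin
  UseIt;
  WriteLn('after UseIt');
end.
```

It is no general GC -- reference cycles leak, and it only works through 
interface references -- but it shows the "no explicit destruction" style is 
not foreign to the language.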


> And I 
> still need less than 2 MB while a Java VM is still loading its 100 MB 
> foot print into memory. ;->

Java is a different story, but GC memory overhead is not that bad in some 
better languages (2-3x). And performance seems to be unaffected:

http://shootout.alioth.debian.org/benchmark.php?test=all&lang=fpascal&lang2=ocaml&sort=fullcpu

Here comes Pascal (FPC) vs OCaml (a hybrid functional/imperative language 
with OO). OCaml is significantly faster although it heavily uses GC. OCaml 
code is significantly shorter too (the same programs can be coded in a 
significantly smaller number of lines).

rgds
-- 
Sebastian Kaliszewski



