[fpc-devel] type discussion
Vinzent Hoefler
JeLlyFish.software at gmx.net
Fri Jun 3 09:44:15 CEST 2005
On Thursday 02 June 2005 15:53, Sebastian Kaliszewski wrote:
> Vinzent Hoefler wrote:
> > On Thursday 02 June 2005 13:54, Sebastian Kaliszewski wrote:
> >
> > No, they aren't (or let me put it this way: it depends on what you
> > measure). For instance, studies indicate that there are ten times
> > more errors in C code than in Ada code once the software is
> > delivered.
>
> If you didn't notice, I wrote "C notwithstanding". C is far away
> from modern high-level languages (and also from many languages older
> than it but still higher-level).
It's not much worse than C++ in typical projects (which, I admit, has
much to do with the people using it).
> >>The studies show that in high-level languages (C notwithstanding)
> >>there is a very evident but simple correlation -- the number of
> >> programmer errors per language construct (in non-obfuscated code,
> >> typically very close to the number of non-empty, non-comment
> >> source lines) is independent of the language.
> >
> > You must have read different studies. :)
>
> Certainly. Those I read were about mainstream software development,
> not niche life-critical realtime systems.
You know that about 90% of all installed computer(chip)s are actually
embedded systems? So I'd consider this "mainstream software" a
niche. :-)
> The study measured the
> error rate in some typical and rather simple programming tasks coded
> in various languages by various programmers. The correlation found
> was clear -- the number of bugs per 1 kLOC was constant
What figures did they give? I'm curious.
> and independent of the
> language. The difference was that the same task could be coded in
> some languages using a significantly smaller number of lines.
Which is quite irrelevant once you've got to maintain it.
> Two things:
> 1. They'd better claim they got significant gains, or someone
> should be fired for wasting money. IOW, I take all such
> corporation-made studies not with a grain but with a shovel of
> salt.
Even the salt doesn't wash away the fact that they were on time and
on budget, and that testing the software more or less boiled down to
simply showing that it works. And considering that most money is
spent in the testing phase (especially, of course, in that particular
domain), the 80% isn't that surprising after all.
> 2. You're talking about a complex life-critical realtime system,
> not mainstream software.
Yes, true. Funny thing is, those companies are now almost able to
deliver software cheaper than the mainstream. If they cut the testing
completely, they definitely could, and the software wouldn't even be
any worse. (BTW, that's my own observation, not some study.)
> In such systems the cost of bugs is many orders
> of magnitude higher than in the mainstream. The cost structure of
> such projects differs strongly from mainstream programming.
Yes. The main cost goes into testing. Which, in mainstream software,
is done by the customer, who even pays for the bugfix called an
upgrade. So, of course, testing is a bad thing: you earn less money
with it and it even costs you.
Is that what you're trying to tell me?
> >>C++ (pre-STL) was the worst there; Python, Perl & the like were
> >>the best. Java was not that bad, but still about 30% worse than
> >> Python (C++ was 100%, i.e. 2x worse).
> >
> > What did they actually measure?
I'd guess that, in that test, languages like Pascal or Ada would get
an even worse rating than C, because these are languages that focus
on design and doing it right instead of on doing it fast. Which, in
practice, means you get compilable code much later, but in the end
this code is more likely to be correct. So it all depends on what you
measure.
> > Which languages did they compare?
> > The list above is (except Python) all more or less C-ish or even
> > worse in syntax.
>
> There was a bunch of popular languages (I don't remember them all);
> the C-like ones were C++, Java & Perl. The others certainly were
> not.
Did they take maintenance into account? (I only ask because that's
what I have been paid for for the last 5 years, and I guess I am not
the only one out there who gets paid for this kind of development.)
> >>The biggest boost to effectiveness was brought by
> >>introducing automated memory management (i.e. getting rid of
> >> explicit memory freeing).
> >
> > Which is something you definitely don't want in large-scale
> > realtime systems.
>
> But FPC is useless in such systems as well.
Not necessarily.
> Besides the lack of convincing
> argumentation about compiler correctness,
Realtime doesn't necessarily mean correctness. A while ago I talked
to one of the developers at an electronics company who wrote the code
for a television. He told me that in the final system there would be
about 140 running threads. That's definitely not small-scale, and
judging by how often I had to reboot my Nokia phone by removing the
batteries, it doesn't mean correctness either.
> you need realtime
> guarantees wrt allocations and other library stuff (which FPC
> does not provide).
No. Simple rule: Don't allocate at runtime.
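To make that rule concrete (just a sketch with made-up names and
sizes, not code from any real system): you reserve all storage at
startup and hand out slots from a fixed pool, so the heap is never
touched once the system is running.
|program StaticPool;
|
|{ Sketch only: a fixed-size pool laid out at load time, so nothing
|  is allocated (and nothing can fail or jitter) at runtime. }
|
|const
|  MaxMessages = 64;
|
|type
|  TMessage = record
|    Id   : Integer;
|    Data : array [0 .. 255] of Byte;
|  end;
|
|var
|  Pool : array [1 .. MaxMessages] of TMessage;
|  Used : array [1 .. MaxMessages] of Boolean;
|
|function Acquire : Integer;
|var
|  I : Integer;
|begin
|  Acquire := 0;  { 0 = pool exhausted: a bounded, predictable failure }
|  for I := 1 to MaxMessages do
|    if not Used [I] then
|    begin
|      Used [I] := True;
|      Acquire := I;
|      Exit;
|    end;
|end;
|
|procedure Release (const Index : Integer);
|begin
|  Used [Index] := False;
|end;
|
|begin
|  { fill Pool [Acquire] and call Release instead of New/Dispose }
|end.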
> >>So even languages with ugly C-ish syntax like "Perl
> >>the awful" can beat otherwise elegant & clean languages.
> >
> > Of course they can, under certain circumstances. Just as a bubble
> > sort can beat a quicksort if you feed it the "right" input.
>
> If [...]
>
> Those examples were real code, not some contrived stuff.
And for how many years did they have to maintain that code?
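To spell the sorting analogy out (a contrived sketch, not anything
from the study): a bubble sort with an early exit finishes an
already-sorted array in a single pass, while a quicksort that naively
takes the first element as pivot degenerates to quadratic time on
exactly that input.
|{ One pass and done on sorted input -- the "right" input for which
|  bubble sort beats a naively-pivoted quicksort. }
|procedure BubbleSort (var A : array of Integer);
|var
|  I       : Integer;
|  Tmp     : Integer;
|  Swapped : Boolean;
|begin
|  repeat
|    Swapped := False;
|    for I := Low (A) to High (A) - 1 do
|      if A [I] > A [I + 1] then
|      begin
|        Tmp       := A [I];
|        A [I]     := A [I + 1];
|        A [I + 1] := Tmp;
|        Swapped   := True;
|      end;
|  until not Swapped;
|end;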
> >>Hence
> >>probably the greatest impact on Object Pascal productiveness
> >>would come from allowing programmers to declare classes as
> >>self-managing (self-freeing, not needing explicit destruction).
> >
> > Maybe, yes. But I'm old school. I like to handle memory myself.
>
> Well, I met some people who needed 'full control', so they wanted
> to code in assembly.
I do that too, if it's necessary (and on an 8 MHz 286 with 128K RAM
it is). No big deal. You know, I even make fewer mistakes then than I
usually make in C.
> > And I
> > still need less than 2 MB while a Java VM is still loading its
> > 100 MB footprint into memory. ;->
>
> Java is a different story, but GC overhead is not that bad in some
> better languages (2-3x).
Yeah, right.
And with a class like Java's NoHeapRealtimeThread you even circumvent
garbage collection completely.
Of course, stuff like Lisp is perfect. Problem is, nobody wants to use
it, either.
> And performance seems to be unaffected:
>
> http://shootout.alioth.debian.org/benchmark.php?test=all&lang=fpascal
>&lang2=ocaml&sort=fullcpu
>
> Here comes Pascal (FPC) vs OCaml (a hybrid functional/imperative
> language with OO). OCaml is significantly faster although it
> heavily uses GC.
So what? The only thing that proves is that the optimizer could be
done better.
> OCaml code is significantly shorter too (it can be coded in a
> significantly smaller number of lines).
Hmm. Funny thing is, if you compare Ada95 and Free Pascal, the test
claims that Ada95 uses far fewer lines of code as well. And
considering that
|function Foo (const I : Integer) : Integer;
|begin
| ...
|end;
in Ada looks like
|function Foo (I : in Integer) return Integer is
|begin
| ...
|end Foo;
which is the same amount if you count LOC (and is even more verbose),
either the Pascal code was unnecessarily complex or something else
was wrong. It just doesn't correlate with practice.
Vinzent.
--
public key: http://www.t-domaingrabbing.ch/publickey.asc