[fpc-devel] type discussion
sebaska at melkor.mimuw.edu.pl
Fri Jun 3 16:52:08 CEST 2005
Vinzent Hoefler wrote:
>>If you didn't notice I wrote "C notwithstanding". C is far away from
>>modern high level languages (and also many older than it but hight
> It's not much worse than C++ in typical projects (which has, I admit,
> much to do with the people using it).
Of course C is much worse than contemporary C++ (C++ with the std:: library).
>>>You must have read different studies. :)
>>Certainly. Those I read were about mainstream software development,
>>not niche life-critical real-time systems.
> You know that about 90% of all installed computer(chip)s are actually
> embedded systems? So I'd consider this "mainstream software" a
> niche. :-)
Count separate apps, not separate installations.
>>The study was measuring
>>error rates in some typical and rather simple programming task coded in
>>various languages by various programmers. The correlation found was clear
>>-- the number of bugs per 1 kloc was constant
> What figures did they give? I'm curious.
I don't remember. I read the paper over a year ago.
>>and independent of the
>>language. The difference was that the same task could be coded in
>>some languages using significantly fewer lines.
> Which is quite irrelevant once you've got to maintain it.
Smaller programs are easier to maintain.
>>1. They'd better claim they got significant gains or someone should
>>be fired for wasting money. IOW I take all such corporation-made
>>studies not with a grain but with a shovel of salt.
> Even the salt doesn't taste away the fact that they were on-time,
This could also be because they simply planned properly for the unknown. You can
do most apps within time & budget even in pure machine code if you
assume a huge enough budget and a long enough time.
>>2. You're talking about complex life-critical realtime system, not
> Yes, true. Funny thing is that those companies are now almost able to
> deliver software cheaper than the mainstream. If they were to cut the
> testing completely, they definitely could, and the software wouldn't
> even be much worse. (BTW, that's my own observation, not some study.)
The same functionality?
>>In such systems the cost of bugs is many orders
>>of magnitude higher than in the mainstream. The cost structure of such
>>projects differs strongly from mainstream programming.
> Yes. The main cost goes into testing. Which, in mainstream software,
> is done by the customer, and he even pays for the bugfix called an upgrade.
> So, of course, testing is a bad thing. You can earn less money with it
> and it even costs you.
> Is that what you're trying to tell me?
No. Simply put, preventing some failure is economically viable if cost_of_failure
* probability_of_failure > cost_of_prevention.
In the systems you're talking about, the cost of failure is enormous, so you can
spend on prevention until the probability of failure is really, really small.
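The break-even rule above, as a small sketch; the numbers are made up purely for illustration:

```java
// Break-even rule for spending on failure prevention: prevention pays
// off when cost_of_failure * probability_of_failure exceeds
// cost_of_prevention. All figures below are hypothetical.
public class PreventionBreakEven {
    static boolean preventionPaysOff(double costOfFailure,
                                     double probabilityOfFailure,
                                     double costOfPrevention) {
        return costOfFailure * probabilityOfFailure > costOfPrevention;
    }

    public static void main(String[] args) {
        // Mainstream app: a crash costs little, so a large prevention
        // budget is not justified (expected loss 100 < 500 spent).
        System.out.println(preventionPaysOff(10_000, 0.01, 500));

        // Life-critical system: failure cost is enormous, so even a tiny
        // residual probability justifies heavy spending on prevention
        // (expected loss 100,000 > 50,000 spent).
        System.out.println(preventionPaysOff(1_000_000_000, 1e-4, 50_000));
    }
}
```

The asymmetry is the whole point: as cost_of_failure grows, the break-even probability shrinks, so prevention spending keeps paying off far longer than in mainstream software.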
>>>>C++ (pre-STL) was the worst there; Python, Perl & the like were the best;
>>>>Java was not that bad but still about 30% worse than Python (C++
>>>>was 100%, i.e. 2x worse).
>>>What did they actually measure?
> I'd guess, in that test, languages like Pascal or Ada would get an even
> worse rating than C.
Nope. There was enough time to develop the app in all but some awkward joke
languages. The problem was simple & well understood.
> Because these are languages that focus on design
> and doing it right instead of focusing on doing it fast.
All languages require design.
> Which, in
> practice, means you get compilable code much later, but in the end this
> code is more likely to be correct. So it all depends on what you
They measured the number of actual bugs in code declared by the author as
ready to deploy.
>>Besides lack of
>>convincing argumentation about compiler correctness,
> Realtime doesn't necessarily mean correctness.
You were talking about flight control system...
> A while ago I talked to
> one of the developers working for an electronics company who wrote the
> code for a television. He told me that in the final system, there
> would be about 140 running threads. That's definitely not small scale,
> and judging by how often I had to reboot my Nokia phone by removing the
> batteries, it doesn't mean correctness either.
And this is an example of what? To me it's just sloppy programming.
A lot of that Nokia stuff isn't realtime either (maybe it should be, but it's not).
>>you need real
>>time guarantees wrt allocations and other library stuff (which FPC
>>does not provide).
> No. Simple rule: Don't allocate at runtime.
If you have a fixed amount of data, then fine. But if not, then it's rather hard.
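One way to read "don't allocate at runtime" is the usual hard-real-time idiom: allocate everything up front and recycle it. A rough sketch of that idea (the class and method names are mine, not from any real-time API):

```java
// Hypothetical fixed-capacity sample buffer: all storage is allocated
// once at startup; the steady-state push/read path performs no heap
// allocation, so neither GC pauses nor allocator jitter can hit the
// real-time loop.
public class SampleRing {
    private final double[] samples;   // allocated once, reused forever
    private int head;                 // index of the oldest sample
    private int count;

    SampleRing(int capacity) {
        samples = new double[capacity]; // the only allocation, at init time
    }

    // Overwrites the oldest sample when full instead of growing.
    void push(double v) {
        samples[(head + count) % samples.length] = v;
        if (count < samples.length) count++;
        else head = (head + 1) % samples.length;
    }

    double oldest() { return samples[head]; }
    int size()      { return count; }
}
```

When the amount of data is not bounded up front, as the reply notes, this stops working cleanly: you must either reserve for the worst case or accept losing data, as the overwrite above does.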
>>Java is a different story, but GC overhead is not that bad in some
>>better languages (2-3x).
> Yeah, right.
That's the reality. Just look at properly designed languages, not Java.
Java is so bad because it uses bloated objects for virtually everything.
Want to store a few int values in a list? You must wrap every int
in an Integer class object. You want to write some values out to a file --
you have to create a cascade of objects to do that.
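A sketch of the two overheads just described. The class and numbers are mine, but the wrapper cascade is the real java.io API; explicit boxing was mandatory before Java 5, and autoboxing since then hides, but does not remove, the Integer objects:

```java
import java.io.ByteArrayOutputStream;
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.io.PrintWriter;
import java.util.ArrayList;
import java.util.List;

public class JavaOverheads {

    // Writing a few ints takes a cascade of wrapper objects:
    // OutputStream -> OutputStreamWriter -> BufferedWriter -> PrintWriter.
    static String writeValues(List<Integer> values) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        try (PrintWriter out = new PrintWriter(
                 new BufferedWriter(
                     new OutputStreamWriter(sink, "UTF-8")))) {
            for (int v : values) out.println(v);  // each read unboxes an Integer
        }
        return sink.toString("UTF-8");
    }

    public static void main(String[] args) throws IOException {
        // Storing ints in a List forces boxing: every element becomes a
        // separate Integer object on the heap.
        List<Integer> boxed = new ArrayList<>();
        for (int i = 1; i <= 3; i++) boxed.add(i);  // allocates an Integer per add

        System.out.print(writeValues(boxed));
    }
}
```

An `int[]` would hold the same three values with no per-element objects at all, which is exactly the contrast the paragraph complains about.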
> And with a "class NoHeapRealtimeThread" you even circumvent garbage
> collection completely.
> Of course, stuff like Lisp is perfect. Problem is, nobody wants to use
> it, either.
It's as true as saying nobody wants to use Ada... IOW both are used in niches, but
both *are* used.
>>And performance seems to be unaffected:
>>Here comes Pascal (FPC) vs Ocaml (hybrid functional/imperative
>>language with OO). Ocaml is significantly faster although it heavily
> So what? The only proof I see is that the optimizer can be done better.
See the comparison with C. They're more or less equal performance-wise.
>>Ocaml code is significantly shorter too (it can be coded
>>in significantly fewer lines).
> Hmm. Funny thing is, if you compare Ada95 and Free Pascal, the test
> claims that Ada95 uses much fewer lines of code as well. And considering
> |function Foo (const I : Integer) : Integer;
> | ...
> in Ada looks
> |function Foo (I : in Integer) return Integer is
> | ...
> |end Foo;
But what is in that '...' is what matters.
In C you'll also get:
int Foo(int i)
> which is the same amount if you count LOC (and even more verbose),
> either the Pascal code is unnecessarily complex or something else was
> wrong. It just doesn't correlate with practice.
Probably the Pascal examples are poor (this was already pointed out on this list).
But don't forget that Ada is more verbose yet also somewhat more
expressive (compare the type systems, for example). More expressive languages
allow for more concise programs.