[fpc-devel] type discussion
JeLlyFish.software at gmx.net
Fri Jun 3 18:05:22 CEST 2005
On Friday 03 June 2005 14:52, Sebastian Kaliszewski wrote:
> Vinzent Hoefler wrote:
> >>If you didn't notice, I wrote "C notwithstanding". C is far away
> >> from modern high-level languages (and also from many languages
> >> older than it but still high-level).
> > It's not much worse than C++ in typical projects (which has, I
> > admit, much to do with the people using it).
> Of course C is much worse than contemporary C++ (C++ + std::).
Only in theory. Look at typical projects in the industry.
> >>>You must have read different studies. :)
> >>Certainly. Those I read were about mainstream software development,
> >>not niche life-critical realtime systems.
> > You know that about 90% of all installed computer(chip)s are
> > actually embedded systems? So I'd consider this "mainstream
> > software" a niche. :-)
> Count separate apps, not separate installations.
Doesn't change the count much, I think. Every f*cking microwave oven has
its own control program these days.
> >>The study was measuring the
> >>error rate in some typical and rather simple programming task coded
> >> in various languages by various programmers. The correlation found
> >> was clear -- the number of bugs per 1 kloc was constant
> > What figures did they give? I'm curious.
> I don't remember. I read the paper over a year ago.
I'm just asking because somebody from a car company claimed that one bug
per 1000 SLOC would be "the normal expected amount", and I have a study
here that claims a bug rate of less than a tenth of that (0.096
defects/KSLOC [and even for their C code, they had 0.676 defects/KSLOC])
for "just a compiler and related tools", not for the potentially
life-critical systems you'd expect in the 70 or more microprocessors a
modern car has.
Which seems quite surprising, considering that the bug rate should be
practically independent of the language (ok, this statement is not
valid for C99, of course; it's the only exception there). (Sarcasm.)
> >>and independent of the
> >>language. The difference was that the same task could be coded in
> >>some languages using a significantly smaller number of lines.
> > Which is quite irrelevant once you've got to maintain it.
> Smaller programs are easier to maintain.
Yes, but small programs are not the typical project. Even small projects
break through the 100 KSLOC barrier very quickly.
> >>Two things:
> >>1. They'd better claim they got significant gains or someone should
> >>be fired for wasting money. IOW I take all such corporation-made
> >>studies not with a grain but with a shovel of salt.
> > Even the salt doesn't taste away the fact that they were on-time,
> > on-budget
> This could also be that they simply properly planned for the unknown.
Of course they did, they are not bloody id10ts and it was not their
first project of this size. It was just the first that was much cheaper
than ever expected.
> You can do most apps within time & budget even in pure machine
> code if you assume a huge enough budget and a long enough time.
Considering that it wasn't government money they had to waste -- instead
they took the risk of the whole development themselves -- this certainly
wasn't the case here.
> >>2. You're talking about complex life-critical realtime system, not
> >>mainstream software.
> > Yes, true. Funny thing is that those companies are now almost able
> > to deliver software cheaper than the mainstream. If they cut
> > the testing completely, they definitely could, and the software
> > wouldn't even be any worse. (BTW, that's my own observation, not
> > some study.)
> The same functionality?
Yes. MULTOS/CA had 0.04 defects per KSLOC with an average "productivity"
of 28 lines of code per developer per day, if I remember the figures
correctly.
> >>In such a system the cost of bugs is many orders
> >>of magnitude higher than in the mainstream. The cost structure of
> >>such projects differs strongly from mainstream programming.
> > Yes. The main cost goes into testing. Which, in mainstream
> > software, is done by the customer, and he even pays for the bugfix
> > called an upgrade. So, of course, testing is a bad thing. You can
> > earn less money with it and it even costs you.
> > Is that what you're trying to tell me?
> No. Simply preventing some failure is economically viable if
> cost_of_failure * probability_of_failure > cost_of_prevention.
Yes. But Cost_Of_Failure sometimes seems to be a negative amount while
Cost_Of_Prevention never is.
> >>>>C++ (pre-STL) was worst there; Python, Perl & the like were the
> >>>> best. Java was not that bad but still about 30% worse than
> >>>> Python (C++ was 100%, i.e. 2x worse).
> >>>What did they actually measure?
> > I'd guess, in that test, languages like Pascal or Ada would get an
> > even worse rating than C.
> Nope. There was enough time to develop the app in all but some
> awkward joke languages. The problem was simple & well understood.
Yeah. The latter may be the problem. Most real-world problems are
neither simple nor well understood until after you have solved them.
> > Because these are languages that focus on design
> > and doing it right instead of focusing on doing it fast.
> All languages require design.
But they encourage it to different degrees.
> >>Besides the lack of
> >>convincing argumentation about compiler correctness,
> > Realtime doesn't necessarily mean correctness.
> You were talking about flight control system...
We were mainly talking about FPC. I'm crazy, but I'm damn sure not crazy
enough to recommend FPC for such kind of system.
> > A while ago I talked to
> > one of the developers working for an electronics company who wrote
> > the code for a television. He told me that in the final system,
> > there would be about 140 running threads. That's definitely not
> > small scale, and judging by how often I had to reboot my Nokia phone
> > by removing the batteries, it doesn't mean correctness either.
> And this is example of what?
The example of a typical commercial, industrial (embedded) project?
> For me it's just sloppy programming.
Exactly. The point is: some languages just make it harder to do so and
some languages make it even harder *not* to do so.
> A lot of that Nokia stuff isn't realtime either (it maybe should be,
> but it's not).
Lots of the code in a realtime environment is not realtime. The 80/20
rule probably applies to almost everything. ;-)
> >>you need real
> >>time guarantees wrt allocations and other library stuff (which FPC
> >>does not provide).
> > No. Simple rule: Don't allocate at runtime.
> If you have a fixed amount of data, then fine. But if not, it's
> rather hard.
Yes, it is. OTOH I know the memory map by heart. :)
> >>Java is a different story, but GC overhead is not that bad in some
> >>better languages (2-3x).
> > Yeah, right.
> That's the reality. Just look at properly designed languages, not
I'm looking at the stuff that gets used most often, because it must be
superior then; otherwise nobody would use it. But, e.g., Python's
automatic memory management didn't convince me one bit more, although
it seems more effective to me.
> > And with a "class NoHeapRealtimeThread" you even circumvent garbage
> > collection completely.
> > Of course, stuff like Lisp is perfect. Problem is, nobody wants to
> > use it, either.
> It's as true as nobody wants to use Ada... IOW both are used in
> niches, but both *are* used.
Yes, as Pascal is. Used by me, for instance. In a commercial industrial
application. And I *do* have fewer bugs than my fellow programmer who
does Python (although I'd like to see it differently, because of the
dynamic typing), and that although the amount of code I maintain is
about five times as big.
The productivity gain he gets from Python is that he can fix the bugs
faster because he doesn't need to compile anything. Just change the
code and run it again.
> >>And performance seems to be unaffected:
> >>al &lang2=ocaml&sort=fullcpu
> >>Here comes Pascal (FPC) vs Ocaml (a hybrid functional/imperative
> >>language with OO). Ocaml is significantly faster although it heavily
> >>uses GC.
> > So what? The only proof I see is that the optimizer can be done
> > better.
> See comparison with C. They're more or less equal performance wise.
That doesn't look much *equal* to me. Maybe my feeling for numbers is
off.
> >>Ocaml code is significantly shorter too (it can be coded
> >>in a significantly smaller number of lines).
> > Hmm. Funny thing is, if you compare Ada95 and Free Pascal, the test
> > claims that Ada95 needs far fewer lines of code, too. And
> > considering that
> > |function Foo (const I : Integer) : Integer;
> > |begin
> > | ...
> > |end;
> > in Ada looks like
> > |function Foo (I : in Integer) return Integer is
> > |begin
> > | ...
> > |end Foo;
> But what is in that '...' is what matters.
Yes, of course. But we were talking about verbosity in general and not
on the amount of standard libraries.
> Probably Pascal examples are poor (it was already signaled on this
> list). But don't forget that Ada is more verbose but is also somewhat
> more expressive (compare the typesystems for example).
Especially the type system accounts for a lot of writing work if you
want to do it "the right way".
For an example, take a look at
It's really nasty when you just consider the writing work. :)
> More expressive languages allow for more concise programs.
To be honest: I wouldn't expect much difference between Ada and
(Free)Pascal programs in terms of lines of code there. But I'd expect a
difference in terms of declaration lines, and in FreePascal much less
code needed overall because of the relatively large number of libraries
you can use (which usually don't count toward SLOC).
public key: http://www.t-domaingrabbing.ch/publickey.asc