[fpc-pascal] Division by Zero - not raised exception

Marco van de Voort marcov at stack.nl
Mon Apr 17 17:19:41 CEST 2006


> > Lazarus perform optimally with say 512-1024MB memory, and not to try to
> > squeeze it into sub 128 MB. That would be counterproductive.
> 
> Inside a total physical memory of 128MB, Delphi compilation is very fast.

Not in a modern version. It probably won't even start.

The point, however, is that those Delphi versions stem from a time when
memory was scarce, and a lot of effort went into optimizing memory usage.

If we tried to match that uncompromisingly with Free Pascal, we would have
to put nearly everything else on hold. That is not sane.

> That means the compilation process is optimized to work with the available
> physical memory (at least under 128MB). The same is the case with the
> command line compiler (dcc32). I doubt it is counterproductive; on the
> contrary. Any attempt to work with the swapfile leads to an enormous
> performance decrease, which is currently the case.

The point is that 128MB is simply an arbitrary boundary, and IMHO not a
realistic one in this day and age.


> During a build of Lazarus, the compiler's peak memory usage is over 85MB.
> Likewise, the 192MB of virtual memory that Windows reserves as a minimum
> is often too small. All of this indicates that optimizing the compilation
> and internal linking process should be one of the top priorities.

Yes, but for maximal throughput on a normal system with a normal amount of
memory, not some arbitrary antique configuration that Delphi 4 or 5 had to
deal with.
 
> In my opinion, FPC and Lazarus can try to rationally use available
> resources

There you hit the core of the problem: you can't reliably detect the
available _physical_ memory under most OSes. So you have to pick a
CPU/memory target, optimize for that, and the user must make that available
or live with the consequences.

> not to force developers or company to buy new hardware. 

That is one way to put it. The other way: just buy new memory, so that FPC
developers don't have to spend disproportionate amounts of time keeping the
software running on very old systems.

Also keep in mind that by the time the branch where this development happens
reaches ordinary release users, we will be at least one, maybe two, years
further on. How many of those old machines will still be running then?

> Note that many companies still work with relatively old hardware and are
> not willing to invest further (which often means replacing old hardware
> with new).

Companies that can't afford an extra 256MB in a developer machine are
effectively already bankrupt, even outside the USA and Western Europe.

Note that for Visual Studio and Delphi (current versions), 2GB is considered
normal.


