[fpc-devel] LinkLib Issues In Lazarus and in FPC-2.0.2
Jonas Maebe
jonas.maebe at elis.ugent.be
Tue Jul 18 01:22:40 CEST 2006
On 17 Jul 2006, at 22:36, Marco van de Voort wrote:
>> Of course not. You should never change a {$linklib magicstring} into
>> some command. The exact command(s) to be executed would have to be
>> embedded in the unit with a compiler directive (and probably, just
>> like library name, be ifdef'd per platform).
>
> The setup of the whole linkaliasing is to change the $linklib into mere
> identifiers (while compiling units) that can then be mapped at will to
> real libnames when determining how to link, with the default being 1:1.
I know, but I was trying to point out that e.g. on Mac OS X sometimes
you may need
-lgtk-1.2.0 -lgdk-1.2.0 -lgdk_pixbuf
and sometimes
-framework Gtk+-Cocoa
and sometimes
-framework Gtk-Carbon
(and there's a fourth "native" Mac OS X gtk port underway, because
the two previous ones were never really finished)
You need something more than merely the remapping of library names.
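In unit terms that currently boils down to something like the sketch
below (the unit name and the GTK_COCOA define are made up, and I'm
assuming a {$linkframework}-style directive for the framework case; the
exact spelling doesn't matter for the argument):

unit gtkint; { hypothetical gtk import unit, names for illustration only }

interface

implementation

{$ifdef darwin}
  {$ifdef GTK_COCOA}
    {$linkframework Gtk+-Cocoa}   { Cocoa port: link the framework }
  {$else}
    {$linkframework Gtk-Carbon}   { Carbon port }
  {$endif}
{$else}
  { classic X11 gtk 1.2 shared libraries }
  {$linklib gtk-1.2.0}
  {$linklib gdk-1.2.0}
  {$linklib gdk_pixbuf}
{$endif}

end.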
> This means we could move the whole ifdef {$linklib} mess to a platform-
> (or distro-) specific fpc.cfg #include and, more importantly, OUT OF
> SOURCE (regardless of internal or external linker, though with the
> external one more of the discovery process must be done up front).
>
> I want more out of source.
I don't really care where exactly the information is stored (although
the unit somehow seems logical to me, if you want to keep the
interface and libs specification together), but if we change things
from the way they are now, then the information should be dynamic
(i.e., not merely determined at install time, because that only
shifts the problem) and preferably come from available automatic
sources (as opposed to the developer having to specify it).
>>> Also, fact is we do have an internal linker and it will hopefully be
>>> used for most platforms in the future.
>>
>> I do not see this happening any time soon for e.g. Mac OS X.
>
> I didn't see it happening at all half a year ago, but thanks to
> Peter...
The GNU tool chain is more or less static compared to the Mac OS X
tool chain. Apple changes internal stuff concerning
libraries and the like all the time for their new whizzbang features
(zero-link, fix-and-continue, prebinding and then not anymore, ...).
It would be hell to keep up with that.
>> That is not my point, and I'm not
>> against providing this functionality. As I said in my reply to Marco,
>> I consider it complementary, as a last resort if all else fails.
>
> I consider pkgconfig very important. But I want to have it feed a
> store, an FPC configuration file.
The problem is that it needs to be "re-fed" every time you compile
something using particular units (those depending on libraries
requiring this mumbo-jumbo). Not to mention on multi-arch systems
(like darwin/i386 which can also run darwin/ppc binaries) where the
parameters will depend on the architecture you are compiling for
(unless you have perfectly symmetrical needs for ppc and i386 with
just a different directory name somewhere, but that's often not the
case because ppc is most of the time supported back to Mac OS X
10.2.8, while i386 support only starts at 10.4).
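Concretely, any stored fragment for the gtk case above would already
have to look something like this (the SDK paths and flag spellings are
only illustrative; -k passes an option straight through to the linker,
-Fl adds a library search path):

#ifdef darwin
#ifdef cpupowerpc
# ppc builds have to keep working back to Mac OS X 10.2.8
-k-macosx_version_min
-k10.2.8
-Fl/Developer/SDKs/MacOSX10.2.8.sdk/usr/lib
#endif
#ifdef cpui386
# i386 only exists from Mac OS X 10.4 on
-k-macosx_version_min
-k10.4
-Fl/Developer/SDKs/MacOSX10.4u.sdk/usr/lib
#endif
#endif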
> This is because I want to be able to say "mail me your compiler
> state", and send back a new one. Too much automation will make things
> less transparent. To the user, but also to the one who has to support
> it.
That's what "-al -s" is for: keep the generated assembler files (with
the source lines interleaved) and write the external assembler and
linker calls to a script instead of executing them, so the whole state
can be mailed around.
> Supporting multiple versions is hard, and other people's designs are
> not always perfect or tailored to our use. Do I have to mention Xcode
> here? :-)
I do not think a closed-source IDE that is being (re)built from
scratch and stuffed with new features all the time (as opposed to
first fleshing out the basics), just to give Steve Jobs something
flashy to talk about at the next WWDC, is comparable to a script
which generates some linker parameters. And a bunch of linker
parameters is exactly tailored to what we need.
>>> that: if we want to be a more professional tool than C, we should
>>> not copy the deficiencies. Makefiles are one of them, Autohell is
>>> another, and I consider this one such a thing as well.
>>
>> I think all this "competing with C" and not-invented-here syndrome is
>> downright silly, along with all the claiming that most bad things
>> come from C.
>
> That was Almindor, not me.
I was replying to a mail from Daniel.
> That is not "not-invented-here" syndrome or C phobia, but a simple
> reality. I agree a bit with Florian that there is a fundamental
> problem in the Unix (and especially the Open Source) way of doing
> things here.
That's more or less what I'm saying below, which you are refuting.
>> If anything, it comes mostly from the standard Unix
>> design philosophy (small, well-delineated tools which perform their
>> core function and *only* that core function)
>
> _eighties_ Unix philosophy. Have you seen the sheer size of an average
> Linux distribution?
Many small things (and some larger, like OpenOffice.org) together
make for large sizes.
> gtk-config? Seems to do an awful lot,
It prints a string with linker parameters.
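On a typical Linux system "gtk-config --libs" prints a single line
along these lines (the exact paths and libraries obviously differ per
installation):

-L/usr/lib -L/usr/X11R6/lib -lgtk -lgdk -rdynamic -lgmodule -lglib -ldl -lXi -lXext -lX11 -lm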
> and call scripts from the compiler binary, selecting one in some
> magic way
In the same magic way as we select the external assembler and linker.
> and parse it and try to merge it
> with our own state and support that?
For the external linker not a single bit of internal state merging is
necessary. It's just a fire-and-forget string. For the internal
linker, yes, you need some kind of parsing just like you need it for
the assembler reader and the binary writer (like Daniel mentioned iirc).
> I'm sorry, but you accuse us of C phobia, but this all sounds like a
> leap of faith that only a Slashdot Linux newbie would make, not a
> seasoned decade-old OSS developer.
I think you are overstating the problem of using the output of gtk-config.
Jonas