[fpc-devel] ref,user and prog in chm format (updated)

Marco van de Voort marcov at stack.nl
Thu Jul 30 21:02:51 CEST 2009

In our previous episode, Graeme Geldenhuys said:
> > Unpacking an HTML zip on a slow machine already takes tens of minutes.
> > 7zip would make it worse.
> Hey, it's quicker than compiling FPC. ;-)

Well, that's why people install releases, so they don't have to :-)
> > The whole idea is to get rid of the unpacked files altogether, and a
> > solid archive is useless as a help format.
> True, but I thought I would mention that I used solid archive mode
> (optional, of course) simply so that it could be compared to .tar.gz,
> which is also a solid archive.

Yes. But the Unix case is even more difficult, since there are no binary
installers there. So it will have to wait until the relevant distros/OSes
ship 7zip in their base systems.

The whole point of .tar.gz (and the shellscript installer) is that it is
always available.
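[Editor's illustration, not part of the original mail.] The "solid archive" point above can be made concrete: in a .tar.gz the whole tar stream is one gzip unit, so reading the last member means decompressing everything before it, while a .zip compresses each member separately and allows direct access. A small Python sketch (file names and page count are made up):

```python
# Why a solid archive (.tar.gz) is a poor fit for random-access help,
# compared to a per-member archive (.zip). Illustration only.
import io
import tarfile
import zipfile

files = {f"page{i}.html": f"<html>doc page {i}</html>".encode()
         for i in range(100)}

# Solid archive: one gzip stream over the whole tar.
tgz_buf = io.BytesIO()
with tarfile.open(fileobj=tgz_buf, mode="w:gz") as tar:
    for name, data in files.items():
        info = tarfile.TarInfo(name)
        info.size = len(data)
        tar.addfile(info, io.BytesIO(data))

# Per-member archive: each file compressed on its own.
zip_buf = io.BytesIO()
with zipfile.ZipFile(zip_buf, "w", zipfile.ZIP_DEFLATED) as zf:
    for name, data in files.items():
        zf.writestr(name, data)

# The zip's central directory lets us jump straight to one member...
with zipfile.ZipFile(io.BytesIO(zip_buf.getvalue())) as zf:
    page = zf.read("page99.html")

# ...whereas the tar.gz is scanned sequentially to reach the same
# member (tarfile hides this, but it decompresses the whole stream).
tgz_buf.seek(0)
with tarfile.open(fileobj=tgz_buf, mode="r:gz") as tar:
    same_page = tar.extractfile("page99.html").read()

assert page == same_page == files["page99.html"]
```

The same trade-off explains the size numbers in the thread: the solid formats (.tar.gz, 7zip in solid mode) compress better precisely because they give up per-file access.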
> > Maybe. Just provide a FPC implementation of the depacker for use in the
> > various installers, and we can test with it.
> I remember looking into this about a year ago. I used Total Commander with
> the 7-zip plugin, which is written in Delphi. And yes, it's a native
> Pascal implementation, not a wrapper around the 7-zip DLLs.

Wouldn't hurt. But it is mostly worthwhile for Windows/DOS, less so for *nix
and OS X. (Jonas, correct me if I'm wrong.)
> > .tar.gz's are solid in a way too, so that is no surprise in this case.
> Not everybody has a fast internet connection, and others, like me, only
> have a certain amount of bandwidth available per 30 days. So I would take
> the extra 3 minutes unpacking the archive any day over the size (11MB vs
> 1.9MB is a huge difference). But yes, I get what you are saying about
> getting rid of unpacked help altogether.

My P233 (which I use for Win98/DOS compatibility) took 40 minutes to unpack
the HTML docs, and then 10+ minutes to index them. Moreover, it is the system
with the smallest HD, so the CHM helps there too. And the CHM archive is more
like 5MB.

The problem, as always in these size discussions, is that you can't be
one-dimensional and focus on size alone. Doing so just hurts the people with
the embedded distros, and heaps of problems mount on the awkward OSes
(read: DOS).

And of course, getting people to actually see such a change through all the
way after the initial suggestion is another major problem. Most people only
want to "fix" their own use case and forget about the rest. Think about stuff
like:

- the in-pascal decompressor.
- fixing the textmode installer
- fixing all makefiles and related stuff. Also in the other repositories.
   ( fpcdocs, fpcbuild)
- test-building releases (as I'm now doing for CHM)
- being release manager for the release that introduces it, and fixing the
   inevitable fallout.

Btw, if size is really a problem, consider generating the docs yourself. SVN
has low overhead after the first checkout, and with a half-decent Linux
distro (e.g. from a magazine) it is quite doable. Actually, that's how I got
involved in docs building in the first place (in the nineties, with the
dreaded latex2html). I assume Git is not much worse than SVN in this
regard. ( :-) )
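[Editor's sketch, not part of the original mail.] Roughly what "generating the docs yourself" could look like at the time; the repository URL and the make target are assumptions based on the FPC project layout, so check the project's own build instructions before relying on them:

```shell
# Hedged sketch: fetch the FPC documentation sources once, then
# rebuild locally. URL and make target are assumptions, not taken
# from this mail.
svn checkout http://svn.freepascal.org/svn/fpcdocs/trunk fpcdocs
cd fpcdocs

# Subsequent updates only transfer deltas, hence the low overhead
# "after the first time":
svn update

# Build the CHM output (assumes fpdoc from an installed FPC release):
make chm
```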
