[fpc-pascal] Writing a file splitter for very large files.
James Mills
prologic at prologitech.com
Tue May 6 17:25:12 CEST 2003
On Tue, May 06, 2003 at 04:51:19PM +0200, A.J. Venter wrote:
> Hi,
>
> I need to write a method to split very large files (8Gb and up) into
> chunks which can later be concatenated back into the original.
>
> The chunks would need to be CD-sized (e.g. around 630 MB).
>
> How could I handle this ?
> Does FPC support opening files that size ?
> If I were to open the file, then read and copy through byte by byte,
> would that work ?
I'm not 100% sure on this one, others can probably help you more.
However, I'm sure you can simply open the file with a standard file
handle, read through it (even byte by byte would work, though block
reads are much faster), and write roughly 630 MB out to each new file.
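
For example, a rough sketch along those lines in Free Pascal (assuming
an FPC release whose TFileStream uses Int64 sizes and an OS/filesystem
with large-file support; the program name, chunk size and buffer size
below are only illustrative) could look like this:

program SplitFile;

{$mode objfpc}{$H+}

uses
  SysUtils, Classes;

const
  ChunkSize = 630 * 1024 * 1024;   { roughly one CD per chunk }
  BufSize   = 64 * 1024;           { copy in 64 KB blocks rather than byte by byte }

var
  Src, Dst: TFileStream;
  Buf: array[0..BufSize - 1] of Byte;
  Remaining, ChunkLeft: Int64;
  ToRead, DidRead: LongInt;
  ChunkNo: Integer;
begin
  { open the source file read-only }
  Src := TFileStream.Create(ParamStr(1), fmOpenRead);
  try
    Remaining := Src.Size;
    ChunkNo := 0;
    while Remaining > 0 do
    begin
      { each chunk is written to <name>.000, <name>.001, ... }
      Dst := TFileStream.Create(Format('%s.%.3d', [ParamStr(1), ChunkNo]), fmCreate);
      try
        ChunkLeft := ChunkSize;
        while (ChunkLeft > 0) and (Remaining > 0) do
        begin
          if ChunkLeft < BufSize then
            ToRead := LongInt(ChunkLeft)
          else
            ToRead := BufSize;
          DidRead := Src.Read(Buf, ToRead);
          if DidRead <= 0 then
            Break;                        { unexpected end of input }
          Dst.WriteBuffer(Buf, DidRead);
          Dec(ChunkLeft, DidRead);
          Dec(Remaining, DidRead);
        end;
      finally
        Dst.Free;
      end;
      Inc(ChunkNo);
    end;
  finally
    Src.Free;
  end;
end.

The pieces can later be joined again in order (for instance with cat on
Unix, or another small program that appends each chunk to one output
file) to get the original tarball back.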
cheers
James
>
> I have some theories, but no conclusive ideas on how to implement this.
> The files in question are uncompressed tarballs. I tried compressing
> them but both bzip2 and gzip choked. This happens whether I use the
> compression switches to tar (when creating them) or afterwards on the
> final file.
>
> Any help, greatly appreciated.
> --
> Story of my life: "Semper in excretum, set alta variant"
> A.J. Venter
> DireqLearn Linux Guru
>
> _______________________________________________
> fpc-pascal maillist - fpc-pascal at lists.freepascal.org
> http://lists.freepascal.org/mailman/listinfo/fpc-pascal