gzip / zip / compress : 2 gig limit?

Post by asbar.. » Fri, 04 Aug 2000 04:00:00



Hi all,

I have a Linux fileserver for my company of ~60 people.  I have about
40 gig of data uncompressed.  I back up to tape, but I'd also like to
be able to tar it (with compression, like tar -cvzf), or zip it, or
something.  Unfortunately all methods die at the 2 gig limit mark.  I
assume this is a limitation of the compression program rather than the
ext2 filesystem isn't it?

I read about somebody compiling gzip to support larger files, but don't
have a clue where to start.  Can anybody help me create compressed
files greater than 2 gig?

Aaron

Sent via Deja.com http://www.deja.com/
Before you buy.

 
 
 

Post by Johan Kullstam » Fri, 04 Aug 2000 04:00:00



> Hi all,

> I have a Linux fileserver for my company of ~60 people.  I have about
> 40 gig of data uncompressed.  I back up to tape, but I'd also like to
> be able to tar it (with compression, like tar -cvzf), or zip it, or
> something.  Unfortunately all methods die at the 2 gig limit mark.  I
> assume this is a limitation of the compression program rather than the
> ext2 filesystem isn't it?

ext2 on ia32 does not support files larger than 2GB.

i am not sure about the limits on tar.  are you using tar *directly*
to your tape device?  for example:

$ cd /dir/tree/to/back/up
$ tar cvzfb /dev/your-tape-drive 200 .

i usually set the TAPE environment variable to the tape device if i
have one or the floppy /dev/fd0 if i don't (yes, the floppy can be
used like a tape!).  then you don't need any "f" option.
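to make that concrete, here is a minimal sketch.  the device name
/dev/st0 is just an assumption -- substitute whatever your tape drive
actually is:

```shell
# sketch only: /dev/st0 is an assumed SCSI tape device name.
# with TAPE set, GNU tar uses it as the default archive,
# so no "f <device>" argument is needed.
export TAPE=/dev/st0
cd /dir/tree/to/back/up
tar cvzb 200 .      # c=create, v=verbose, z=gzip, b=blocking factor 200
```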

> I read about somebody compiling gzip to support larger files, but don't
> have a clue where to start.  Can anybody help me create compressed
> files greater than 2 gig?

> Aaron


--
J o h a n  K u l l s t a m

sysengr

 
 
 

Post by asbar.. » Fri, 04 Aug 2000 04:00:00


Hi,

No, I was TARring directly to a file on the ext2 filesystem.  I guess
ext2 DOES have a 2 gig limit eh?  So I'm just out of luck!

Aaron

> ext2 on ia32 does not support 2GB files.

> i am not sure about the limits on tar.  are you using tar *directly*
> to your tape device?  for example:

> $ cd /dir/tree/to/back/up
> $ tar cvzfb /dev/your-tape-drive 200 .

> i usually set the TAPE environment variable to the tape device if i
> have one or the floppy /dev/fd0 if i don't (yes, the floppy can be
> used like a tape!).  then you don't need any "f" option.


 
 
 

Post by Dances With Crows » Sat, 05 Aug 2000 04:00:00



>No, I was TARring directly to a file on the ext2 filesystem.  I guess
>ext2 DOES have a 2 gig limit eh?  So I'm just out of luck!

Well, not quite.  The 2G limit is an artifact of the Intel architecture,
and does not exist on a 64-bit Sparc or Alpha.  ext2 itself is 64-bit by
design, it's just the VFS for Intel that's 32-bit.

Even this has been fixed recently.  RedHat 6.2 shipped with a patched
kernel and a patched glibc that together could handle files > 2G.  The
problem is that not every application takes advantage of this relatively
new feature.  Do a Deja search on this NG for "2G file size limit" and
see what you come up with; someone posted a guide to upgrading
everything a while ago....
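One rough way to check whether your particular kernel/glibc/filesystem
combination already handles big files is to try writing a single byte
just past the 2G mark (the file name here is arbitrary; dd's exit
status is the real signal -- a sparse file costs almost no disk):

```shell
# rough probe: try to write one byte at offset 2GiB.
# succeeds only if kernel, libc, and filesystem all allow files > 2GB.
if dd if=/dev/zero of=bigtest bs=1 count=1 seek=2147483648 2>/dev/null; then
    echo "large files OK"
else
    echo "stuck at the 2G limit"
fi
rm -f bigtest
```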

--
Matt G|There is no Darkness in Eternity/But only Light too dim for us to see
Brainbench MVP for Linux Admin /   Tyranny is always better organized
http://www.brainbench.com     /    than freedom.
-----------------------------/              --Charles Peguy

 
 
 

Post by Andreas Kähäri » Sat, 05 Aug 2000 04:00:00



>Hi all,

>I have a Linux fileserver for my company of ~60 people.  I have about
>40 gig of data uncompressed.  I back up to tape, but I'd also like to
>be able to tar it (with compression, like tar -cvzf), or zip it, or
>something.  Unfortunately all methods die at the 2 gig limit mark.  I
>assume this is a limitation of the compression program rather than the
>ext2 filesystem isn't it?

>I read about somebody compiling gzip to support larger files, but don't
>have a clue where to start.  Can anybody help me create compressed
>files greater than 2 gig?

>Aaron


The ext2 file system supports disks up to 4 terabytes, but individual
files can only be 2 gigabytes.

I think ext2 under the Linux 2.2 kernel supports larger files (up to 1
terabyte).

For more info, search for "large file support" at
<URL:http://www.google.com/linux>.

/A

--
# Andreas Kähäri, <URL:http://hello.to/andkaha/>.
# ...brought to you from Uppsala, Sweden.
# All junk e-mail is reported to the appropriate authorities.
# Criticism, cynicism and irony available free of charge.

 
 
 

Post by Byron A Jeff » Sat, 05 Aug 2000 04:00:00



>Hi all,

Hi. I mailed this to Aaron along with posting...

>I have a Linux fileserver for my company of ~60 people.  I have about
>40 gig of data uncompressed.  I back up to tape, but I'd also like to
>be able to tar it (with compression, like tar -cvzf), or zip it, or
>something.  Unfortunately all methods die at the 2 gig limit mark.  I
>assume this is a limitation of the compression program rather than the
>ext2 filesystem isn't it?

It's a limitation of the 32-bit Virtual File System (VFS) layer in
Linux kernels before 2.4 on Intel hardware.

>I read about somebody compiling gzip to support larger files, but don't
>have a clue where to start.  Can anybody help me create compressed
>files greater than 2 gig?

Another option. You could simply split the output of your gzip into multiple
files. For example:

tar -czvf - /dir/tree/to/back/up | split -b 1800m - archive

would create a series of 1.8 Gig files that contain your archive. To
reconstitute:

cat archive* | tar -xzvf -

This is a temp fix until 2.4 with large file support is in wide use.
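A quick way to convince yourself the pieces really do go back together
is a scaled-down round trip. The 512-byte chunk size and the demo
directory name below are just for illustration:

```shell
# scaled-down round trip: tiny chunks instead of 1800m pieces.
mkdir -p demo && echo "some data" > demo/file.txt
tar -czf - demo | split -b 512 - archive.
# reassemble the pieces and list the archive to verify integrity.
cat archive.* | tar -tzf - && echo "archive lists cleanly"
rm -rf demo archive.*
```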

BAJ

 
 
 

Post by asbar.. » Tue, 08 Aug 2000 04:00:00


Now THIS is very cool!  I tried it and it works great, thanks for the
advice!

Aaron



> Another option. You could simply split the output of your gzip into
> multiple files. For example:

> tar -czvf - /dir/tree/to/back/up | split -b 1800m - archive

> would create a series of 1.8 Gig files that contain your archive. To
> reconstitute:

> cat archive* | tar -xzvf -

> This is a temp fix until 2.4 with large file support is in wide use.

> BAJ

 
 
 

1. How to compress to ZIP instead of gzip?

Hi All,

Hope someone can help me out with this.

I'm running the latest version of Perl on FreeBSD and using
the latest version of Apache. I am using the code below to
successfully compress the txt file file.txt into a gzip file.

use LWP::Simple;



My question is: how can I compress the file into a zip file
instead of a gzip file? I searched CPAN but could not find
anything to do that. Is there any way to modify my code above
to create a .zip file instead of a gzip file?

Thanks and I do hope to hear from someone soon.

Regards,
Alan


2. Is zsh y2k compliant?

3. How to compress multiple vol. using gzip/zip

4. Real Time Linux Workshop at UPV

5. gzip compresses its own binary when running from the ksh scripts

6. Tool for checking remote connection

7. gzip/compress compression routine/library want

8. 2.5.1, DNS zone transfers

9. gzip vs. winzip/pkzip vs. compress

10. Q: Compressing backup using dump and gzip

11. gzip and compressed tar

12. Auto-compressing all output when browser says "Accept-Encoding: gzip": Possible?

13. Compressing a directory structure with gzip