> Hello,
> I've been trying to compress a large directory structure using gzip, and
> have tried the following commands:
> gzip -r directory_name > zipped_file.gz
> gzip -rc directory_name > zipped_file.gz
> Neither one of these seems to work. I believe the problem is that there are
> symbolic links within the directory tree that I am trying to compress that
> link to other directories in the same tree. The program seems to get stuck
> with this. Is there a flag for gzip that makes it ignore symbolic links? If
> not, how else would I zip a directory tree like this?
> Thanks in advance for the help,
> Adam
Okay, first of all, gzip skips symbolic links all by itself (it only touches
regular files), so you have no worries there.
Second, gzip does not write to stdout by default, so
gzip -r directory_name > zipped_file.gz
doesn't do what you'd hope: gzip compresses each file in place, one by one,
and the redirect just leaves zipped_file.gz empty. If you want to write to
stdout, you want "gzip -c".
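Here's a quick sketch of what -c does (the file name is just an example):

```shell
# A sample file to play with.
printf 'hello world\n' > notes.txt

# -c writes the compressed stream to stdout; the original is untouched.
gzip -c notes.txt > notes.txt.gz

# -t checks the archive's integrity without unpacking it.
gzip -t notes.txt.gz && echo "archive OK"
```

Note that plain "gzip notes.txt" would instead replace notes.txt with
notes.txt.gz; -c is how you keep the original around.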
Third, gzip -r individually zips every file it acts on into its own .gz file:
$ mkdir /tmp/test
$ cd /tmp/test
$ touch a b c d
$ ls
a b c d
$ gzip -r /tmp/test
$ ls
a.gz b.gz c.gz d.gz
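And it's reversible: gunzip -r (or gzip -dr) walks the same tree and
decompresses everything back in place. A small sketch, with a made-up
directory name:

```shell
# Build a little tree, compress it in place, then undo it.
mkdir -p demo
touch demo/a demo/b

gzip -r demo      # demo now holds a.gz and b.gz
gunzip -r demo    # walks the tree and restores a and b
```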
I don't know if this is the behavior you want... If you want one big .gz file
with everything in it, either cat the files together first:
$ cat a b c d | gzip -c > bigfile.gz
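One thing to be aware of with the cat approach: the individual file names and
boundaries are thrown away. A sketch, with made-up file names:

```shell
# Two sample files.
printf 'one\n' > a
printf 'two\n' > b

# Concatenate, then compress the combined stream.
cat a b | gzip -c > bigfile.gz

# Decompressing gives back ONE merged stream, not two files.
zcat bigfile.gz
```

zcat prints "one" then "two" as a single stream; there's no way to split it
back into a and b afterwards. That's exactly the problem tar solves.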
Or, (this is way better), use tar to collect all your files and directory
structure, uncompressed, into a single .tar file. Then you gzip that. This
is what is known as a tarball, and is a typical method of distributing unix
software.
Here we go:
$ tar -cf - (directory to compress) | gzip -c > (filename.tar.gz)
Oh, and btw, tar stores symlinks as symlinks instead of following them
(unless you ask it to with -h), so no infinite loops there either.
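For completeness, GNU tar can do the gzip step itself with -z, and getting
things back out looks like this (the archive and directory names are
placeholders):

```shell
# Something to archive.
mkdir -p somedir
printf 'data\n' > somedir/file

# Create and gzip-compress in one step (GNU tar's -z flag).
tar -czf archive.tar.gz somedir

# Peek inside without extracting.
tar -tzf archive.tar.gz

# Unpack it again.
rm -rf somedir
tar -xzf archive.tar.gz
```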
Hope this helps,
Ryan.