untar large number of files

Post by Pat Batema » Mon, 03 Apr 2000 05:00:00



I'm trying to untar a large number of files. I already tried a for loop with
for i in `ls`;do tar -xvf $i;done but it doesn't work. I also tried
for i in `ls -l | awk {print $9}`;do tar -xvf $i;done but neither seems to work.
These commands work on my Linux system, but not on the other system where I have a shell account.
Thanks in advance.

untar large number of files

Post by John Hobson » Mon, 03 Apr 2000 05:00:00



>I'm trying to untar a large number of files. I already tried the for command with
>for i in `ls`;do tar -xvf $i;done but it doesn't work. I also tried
>for i in `ls -l | awk {print $9}`;do tar -xvf $i;done but none seems to work.
>These commands work on my linux system, but not on the other where i have a shell account.

Have you tried simply

tar -xvf whatever_your_tar_file_is_called

The commands you give are appropriate for tarring files (with -c, not -x),
not for untarring them.
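For the record, creating an archive uses tar's -c flag and extracting uses -x. A minimal sketch of the distinction (the file and archive names here are placeholders, not from the original posts):

```shell
# Create an archive containing several files (-c = create):
tar -cvf backup.tar file1 file2 file3

# Extract everything from that single archive (-x = extract):
tar -xvf backup.tar
```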

John Hobson


untar large number of files

Post by Dave Wotto » Fri, 07 Apr 2000 04:00:00



>I'm trying to untar a large number of files. I already tried the for command with
>for i in `ls`;do tar -xvf $i;done but it doesn't work. I also tried
>for i in `ls -l | awk {print $9}`;do tar -xvf $i;done but none seems to work.
>These commands work on my linux system, but not on the other where i have a shell account.

What do you mean by "it doesn't work"? Do you get an error message? If so,
it would help if you told us what it is.

Contrary to what John says, these commands look fine to me (except you've
probably missed off some quotes around {print $9}). I'm assuming that
what you mean is that you're trying to unpack a large number of tar files,
not that you're trying to unpack a large number of files from a single
tar file. But this syntax is for the Bourne shell (and related shells).
Perhaps the system you're running them on defaults to csh or something
similar. Try switching to the Bourne, Korn or Bash shell before
executing your command (e.g. run one of the commands sh, ksh or bash
first).
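To illustrate the shell-syntax point: the loop below is Bourne-family syntax, and csh rejects it outright because csh uses different loop keywords. A minimal sketch, assuming the archives are all named *.tar:

```shell
# Works in sh, ksh and bash -- csh fails with a syntax error on "do":
for i in *.tar; do
    tar -xvf "$i"
done

# The csh/tcsh equivalent would be:
#   foreach i (*.tar)
#       tar -xvf $i
#   end
```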

My only other thought is that neither of your attempts will work if
you've got a VERY large number of files to process, as the expanded
command line will be too long (in most shells). If that's the case, try:

   ls | awk '{print "tar -xvf " $1 }' | sh

(Actually, on reflection, that will also sidestep any question of which
shell you're using, since that pipeline is valid in both sh and csh.)
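Another way around the command-line length limit is xargs, which hands the file names to tar in batches that never exceed the system's limit. A sketch under the assumption that all the archives end in .tar and that none of the names contain spaces:

```shell
# -n 1 runs one tar invocation per archive name read from stdin;
# xargs itself guarantees each command line stays within the limit.
ls | grep '\.tar$' | xargs -n 1 tar -xvf
```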

Dave.
--
There's no need to reply to this posting by email, but if you do,
remove the "nospam" from my email address first.


1. Library management tools for large number of large data files.

Hello, we have an application where a large number of large (10-150M)
data files need to be kept track of and checked in and out by users.
Does anyone know of a program, commercial or otherwise, that can do
this? The way it is looking now, the total size of the files may be tens of
gigabytes, so they may have to be archived on tape. That means a
program that can keep track of files on tape as well as on disk would
be helpful. Any suggestions are welcome. Thank you.

-- Bala  Guthy
