> Hi,
> When trying to delete about 1000 of files with the "rm *.*" command from a
> csh script I get the error message "Argument
> list too long". I guess the shell is trying to make a huge list of the
> filenames and then trying to delete each file. Does anybody
> have a solution for this? The filenames mostly consist of random numbers, so
> a solution with, e.g., "rm *153.*" is no good.
Someone has already mentioned find with or without xargs, I am sure.
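That find/xargs approach looks roughly like this (a sketch, not csh-specific — it works from any shell; note that -maxdepth 1 and -print0/-0 are GNU/BSD extensions, and the scratch directory below is only so the demo doesn't touch real files):

```shell
# find prints each matching name and xargs batches them into as few
# rm invocations as will fit, so no single command line ever exceeds
# the kernel's argument-list limit (the cause of "Argument list too long").
dir=$(mktemp -d)            # scratch directory for the demo
cd "$dir"
touch 001.dat 002.dat 003.dat   # stand-ins for the ~1000 files

# -maxdepth 1 keeps find from descending into subdirectories;
# -print0 / -0 make names with spaces or newlines safe.
find . -maxdepth 1 -type f -name '*.*' -print0 | xargs -0 rm -f
```

With GNU or BSD find you can skip xargs entirely via `find ... -exec rm {} +`, which batches the same way.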
You can also use a brute force method:
# Splits the glob into 100 smaller batches, one per two-digit suffix
# (assumes the names end in at least two digits before the dot)
foreach i ( {1,2,3,4,5,6,7,8,9,0}{1,2,3,4,5,6,7,8,9,0} )
    rm *$i.*
end
or
# Splits the glob into 1000 batches, one per three-digit suffix,
# for when even the two-digit version leaves the batches too big
foreach i ( {1,2,3,4,5,6,7,8,9,0}{1,2,3,4,5,6,7,8,9,0}{1,2,3,4,5,6,7,8,9,0} )
    rm *$i.*
end
You might need a "set nonomatch" (otherwise csh aborts the loop when a
pattern matches no files), and to use "/bin/rm -f" instead of "rm"
(in case you have "rm" aliased to "rm -i").
--
Bruce <barnett at crd. ge. com> (speaking as myself, and not a GE employee)