I'm stumped. I may have to switch to perl for this, but I didn't want to.
Originally I had a script that did:
find dir -print | head -50 > file
xargs -r script < file
xargs -r rm < file
Well, of course this has some problems, especially with things like embedded
spaces in the file names.
So, I switched to using the GNU extension -print0 for find and -0 for
xargs. But now, of course, I can't use head to limit the processing to 50 files.
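(One possible way around the head limitation, a minimal sketch assuming a
recent GNU coreutils: head gained a -z/--zero-terminated option in coreutils
8.25, which makes it count NUL-terminated records instead of newline lines,
so the original head-based limit works again. The directory and file names
below are just placeholders.)

```shell
# Sketch only: requires GNU head with -z (coreutils 8.25+).
demo=$(mktemp -d)
touch "$demo/file with spaces" "$demo/plain"

# Take at most 50 NUL-terminated names, pass them NUL-delimited to xargs.
find "$demo" -type f -print0 | head -z -n 50 | xargs -0 ls -l
```

The same head -z stage would slot into the original pipeline in place of the
plain head.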
Things need to be batched for the script to be useful (it formats
information for human review, then the rm gets done based upon positive
feedback), so I can't use find -exec.
Using -l50 for xargs gets me part way there, but it still runs the script multiple times.
I guess I could use find -print | head and modify the script to read the file
names from stdin rather than the command line, but I'm wondering if there are
other solutions (outside of adding a --run-maxtimes option to xargs).
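(For the read-from-stdin variant, a minimal sketch in bash, not POSIX sh:
read -d '' splits stdin on NULs, so the script can consume find -print0
output directly and still batch 50 names per invocation without the
embedded-spaces problem. Here "review_cmd" and "dir" are placeholders for
the actual review script and directory.)

```shell
# Sketch only: bash-specific (arrays, read -d '').
review_cmd() { printf 'reviewing %d file(s)\n' "$#"; }  # stand-in for the real script
dir=.

find "$dir" -type f -print0 |
{
    batch=()
    while IFS= read -r -d '' name; do
        batch+=("$name")
        if [ "${#batch[@]}" -eq 50 ]; then
            review_cmd "${batch[@]}"   # one invocation per batch of 50
            batch=()
        fi
    done
    # flush the final partial batch, if any
    if [ "${#batch[@]}" -gt 0 ]; then
        review_cmd "${batch[@]}"
    fi
}
```

The rm step could be driven the same way after the human review comes back
positive, since the names survive intact inside the array.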