> the first 180 files i cleaned with
> :%s/^M$//
> in vi
> as i'm an absolute beginner trying to install some
> php-service i seem to stumble over the problem of
> removing these dos-linefeeds.
How are you getting these files? Most popular transfer methods, such
as FTP and Kermit, will take care of this at transfer time when used
properly -- in FTP's case, by transferring text files in ASCII mode
rather than binary. That is, the real answer to your question is
probably to go back to the previous step and fix it there.
However, there are several ways to take care of it now. Rather than
opening each file in vi, you could do the same thing from the command
line with sed
sed 's/^M$//' file
where ^M is a literal carriage return, typed as Ctrl-V followed by
Ctrl-M (the same trick you use inside vi), not a caret and an M.
You might also want to look for both CRLF and LFCR
sed 's/^M$//;s/^^M//' file
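If typing a literal ^M is a nuisance, tr will do the same job with an
ordinary escape. A sketch, with a hypothetical sample file -- note that
tr strips every carriage return in the file, not just the ones at end
of line, which is fine for text that shouldn't contain any:

```shell
# Create a hypothetical DOS-style sample file.
printf 'hello\r\nworld\r\n' > sample.txt
# Delete every carriage return, writing the result to a new file.
tr -d '\r' < sample.txt > sample.unix
```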
Alternatively, you may have a program called 'dos2unix' or 'fromdos'
on your machine
fromdos < file
And, of course, you could do it in Perl
perl -0pe 's/(\r\n|\n\r)/\n/g' file
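If you want to confirm what line endings a file actually has, od -c
prints every byte, so you can check before and after. A sketch with a
hypothetical sample file:

```shell
# A hypothetical CRLF file; od -c will show the \r \n pairs.
printf 'one\r\ntwo\r\n' > sample.txt
od -c sample.txt
# After the Perl one-liner, the \r bytes should be gone.
perl -0pe 's/(\r\n|\n\r)/\n/g' sample.txt > sample.unix
od -c sample.unix
```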
Each of the above would read the one file and write to stdout. To
create a new file, you could redirect the output to a file. To do a
whole directory, you could write a little loop
for file in dir/*; do
fromdos < "$file" > "$file.new"
done
and then move the file.new files back to file. You could do this
either on the command line or in a shell script. Or you could use
Perl's "in place" feature to do that for you
for file in dir/*; do
perl -0pi -e 's/(\r\n|\n\r)/\n/g' "$file"
done
It doesn't really do it in place. You can see that with something like
for file in dir/*; do
perl -0pi.bak -e 's/(\r\n|\n\r)/\n/g' "$file"
done
to save the old files in file.bak. Of course, if you're going to use
Perl, then you can have it do the loop for you too
perl -0pi -e 's/(\r\n|\n\r)/\n/g' dir/*
If there are too many files in your directory for file globbing, then
use xargs
man xargs
to send the files one at a time to whichever solution you choose. Or
just have Perl read the directory for you also.
I hope this helps,
Tim