I have a situation where users are saving files from MS Windows to
my Solaris machine. The problem is that when I try to write UNIX
shell scripts to operate on those files, weird characters such as
blanks get in my way. So, I wrote a wee shell script to replace blanks
with underscores, like this:
# Script: fixspace
# Author: Jack Shostak
# Purpose: This script removes all blank spaces from unix filenames.
# Given a path, this script recursively runs down through the
# tree renaming filenames with blanks in them to filenames with
# "_". So, a file called f i l e.txt would become f_i_l_e.txt.
# This also works on directory names under the root folder.
# To run: Simply copy the script to a directory and type
# source fixspace path
# where path is a valid UNIX path
# load each directory into 'd'
foreach d (`find $1 -type d -print`)
    # load each file into 'f' - directories included
    foreach f ("`find $d -type d -print`" "`find $d -type f -print`")
        # if the name contains a blank, translate it to underscores via mv
        if ("$f" =~ *" "*) then
            mv "$f" `echo "$f" | tr ' ' '_'`
        endif
    end
end
I have two problems with the above:
1) Microsoft Word has a habit of leaving dollar signs in the names of
the work files it scatters about. The above script gags on those
dollar signs, since csh sees them as variable references.
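For reference, here is the direction I have been toying with to dodge
the dollar-sign problem: a Bourne-shell loop instead of csh, where
single quotes and "$f" keep the dollar signs literal. The directory
name demo_dir and the sample filename are made up for illustration:

```shell
#!/bin/sh
# Rename files whose names contain blanks or dollar signs.
# In sh, '$' inside single quotes and inside "$f" stays literal,
# so there is no variable-expansion problem.
mkdir -p demo_dir
touch 'demo_dir/~$report doc.txt'   # the kind of name Word leaves behind
# -depth renames a directory's contents before the directory itself
find demo_dir -depth -name '*[ $]*' | while IFS= read -r f; do
    dir=`dirname "$f"`
    base=`basename "$f" | tr ' $' '__'`
    mv "$f" "$dir/$base"
done
```

This only rewrites the basename on each pass, so a renamed parent
directory never invalidates a child's path mid-run.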
2) I just ran across a case where the find statement in the second
foreach clause above returns a "too many words" error. Perhaps there
are too many files returned for foreach? I didn't know there was a
limit on the size of its word list.
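One idea I have been experimenting with to get around the word-list
limit: let find hand the filenames straight to a small per-batch
renamer via -exec, so no shell ever has to build the full list in one
go. The demo2 directory and its contents here are hypothetical:

```shell
#!/bin/sh
# find ... -exec sh -c '...' sh {} + passes filenames to the inner
# shell in batches, so nothing like csh's foreach word list is built.
mkdir -p 'demo2/sub dir'
touch 'demo2/sub dir/a b.txt'
# -depth lists each file before its parent directory, so renaming the
# directory afterwards never leaves a stale path in the batch
find demo2 -depth -name '* *' -exec sh -c '
    for f; do
        dir=`dirname "$f"`
        base=`basename "$f" | tr " " "_"`
        mv "$f" "$dir/$base"
    done' sh {} +
```

The same pattern should also absorb the dollar-sign case by widening
the -name pattern and the tr character sets.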
If anyone can help me resolve either of the above, or can point me to
a good script that will clean up MS Windows file naming
irregularities, I would be very appreciative.