> : I tried to use wget, but it behaves very strangely for me.
> : I've read the man page, and as far as I can tell I have to give a
> : command like this (example):
> : wget -r -l0
> : ftp://ftp.germany.eu.net/pub/os/Linux/Distributions/Slackware/slakwar...
> : to get the data from this directory and from all its subdirectories.
> : But it really doesn't. It gets only the files (not the directories)
> : from this directory and then (very strangely) downloads the files of
> : all the parent directories, without recursing into any subdirectory.
> : When it reaches the top level (in the example) ftp://ftp.germany.eu.net/
> : it starts to download all the files in that directory. This is
> : really not what I wanted.
> : Does anybody know what goes wrong here?
> Or you can try something like this; it works fine for me. :) Create a file
> with your text editor and put the URL(s) in it, then simply run:
> wget -o log.log -i file.txt -t 0 -r -l <number> -c &
> where file.txt is the file containing your requested URLs, and
> <number> is the recursion depth: 1 will download only one level below the
> requested URL, 2 will do two levels, and 0 will take all of it. :) log.log
> is the file where you can watch how much has been downloaded, -c lets a
> broken download continue where it left off, and & runs the job in the
> background. :) See ya, hope this helps...
Did you try the "--no-parent" option? It's in the man page...
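That option addresses exactly the original complaint: wget climbing into parent directories. A minimal sketch (again using the Slackware directory purely as an example, with the fetch gated behind RUN_FETCH):

```shell
# Illustrative starting point; any directory URL works here.
start_url='ftp://ftp.germany.eu.net/pub/os/Linux/Distributions/Slackware/'

# -r          : recursive retrieval
# -l 0        : unlimited recursion depth
# --no-parent : never ascend above the starting directory, so wget
#               fetches only this directory and its subdirectories
# Set RUN_FETCH=1 before running to actually start the download.
if [ "${RUN_FETCH:-0}" = 1 ]; then
    wget -r -l 0 --no-parent "$start_url"
fi
```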
I've found that using wget recursively on FTP sites through a proxy
produces strange results, caused by the proxy generating HTML listings
of the directories. This confuses wget, which saves each listing as a
file instead of creating a directory of the same name; if the (local)
directory already exists, it creates "index.html" files which weren't
on the original site!
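One workaround, assuming your network allows direct FTP connections, is to take the proxy out of the picture for that session, so wget receives real FTP directory listings instead of proxy-generated HTML pages:

```shell
# wget routes requests through a proxy when these environment variables
# are set; unsetting them for the current shell session makes wget
# speak FTP directly to the server.
unset ftp_proxy
unset http_proxy
```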
Internet/WWW Solutions, Application Services Europe,
Hewlett-Packard GmbH, Boeblingen, Germany