Greetings!
I'm looking for an app that, given a URL, will recursively
download all the linked pages and their contents and files.
I understand wget might be able to do this, but from what I've
read on Freshmeat, I'm not so sure.
Ultimately, what I need the program to do is this: say I
specify a link such as:
http://some.edu/c201/lectures.html
That page contains several other links, one of which points to
an actual directory listing rather than another webpage, e.g.:
http://some.edu/c201/ass2/
which contains various text files, PostScript files, source files, etc.
So, along with following ordinary links, I need an app that can
recognize a directory listing (as opposed to a webpage) and
download and save all of its contents.
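In case it clarifies what I mean by "recognize a directory", here is a rough Python sketch of my own (not any particular app's code, and the URLs are just the examples above) of the logic I'd expect such a tool to perform: collect the links from a page, then treat URLs ending in a slash as server directory indexes to descend into rather than plain pages:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    """Collect the href targets of <a> tags on one page,
    resolved against the page's own URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def looks_like_directory(url):
    """Heuristic: a URL ending in '/' is usually a directory
    index (like /c201/ass2/), not an ordinary webpage."""
    return url.endswith("/")

# Example: a lectures page linking to a PostScript file and a directory.
page = '<a href="notes1.ps">Notes</a> <a href="ass2/">Assignment 2</a>'
collector = LinkCollector("http://some.edu/c201/lectures.html")
collector.feed(page)
for link in collector.links:
    kind = "directory" if looks_like_directory(link) else "file/page"
    print(kind, link)
```

A real app would then fetch each directory index and recurse the same way; the slash heuristic is the part I'm not sure ready-made tools handle.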
Can someone please recommend an app that will do this for
me? TIA.
Best,
BR
To Reply In Private, Please Remove: ON, MAPS, and .invalid