> > Does anyone know of a script I can use that will email a notification
> > whenever the contents change on any given URL? I know there are some
> > "URL Minder" services freely available, but we're behind a corporate
> > firewall, so they won't work. I'd like to run a local version.
> > Ideally, this should recurse any filesystems/directories without
> > having to explicitly set it up for individual URLs.
> I don't know of any, but it should be pretty easy to gin one up.
> Use wget or 'lynx -dump' or similar to grab the page(s) and save them.
> Use wget or whatever to grab the pages again and save them to a different
> location.
> Use diff to compare the pages, and if they're different, send email.
> It's a little trickier if you've got multiple pages, since you have to
> recurse through them, but not much.
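Here is a rough, untested sketch of that single-page approach; the URL,
mail address, and file names are only placeholders:

    #!/bin/sh
    # Fetch one page, compare it with the copy saved last time,
    # and mail the diff if anything changed.
    url='http://www.example.com/page.html'      # placeholder
    dir=$HOME/.urlwatch
    old=$dir/page.old
    new=$dir/page.new

    mkdir -p "$dir"
    wget -q -O "$new" "$url" || exit 1          # or: lynx -dump "$url" > "$new"

    if [ -f "$old" ] && ! diff -q "$old" "$new" >/dev/null
    then
        diff "$old" "$new" |
            mail -s "Change detected: $url" you@example.com
    fi
    mv "$new" "$old"                            # keep this copy for next time

Run it from cron and you have your notifier.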
If you don't want to keep local copies of the files, download each
site with 'wget -r', and store the date of last modification of each
file in a table, then erase the download.
To compare, repeat the process on the next run and diff the two tables;
if they differ, mail the diff as the notification.
Then replace the old table with the new one.
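A sketch of that version (also untested; the site, mail address, and
paths are placeholders, and the timestamp table assumes GNU wget and
GNU find):

    #!/bin/sh
    # Mirror the site, tabulate each file's last-modification time,
    # discard the mirror, and diff the table against the previous run's.
    site='http://www.example.com/'               # placeholder
    dir=$HOME/.urlwatch
    mirror=$dir/mirror
    oldtab=$dir/dates.old
    newtab=$dir/dates.new

    mkdir -p "$mirror"
    # -N sets each local file's mtime to the server's Last-Modified date
    wget -q -r -N -P "$mirror" "$site"

    # One line per file: path and modification time (GNU find assumed).
    ( cd "$mirror" && find . -type f -printf '%p %TY-%Tm-%Td %TH:%TM\n' ) |
        sort > "$newtab"
    rm -r "$mirror"

    if [ -f "$oldtab" ] && ! diff -q "$oldtab" "$newtab" >/dev/null
    then
        diff "$oldtab" "$newtab" |
            mail -s "Changes on $site" you@example.com
    fi
    mv "$newtab" "$oldtab"

Note that this only notices changes the server reports through its
Last-Modified dates; keeping full local copies, as above, catches
everything at the cost of disk space.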
--
Chris F.A. Johnson http://cfaj.freeshell.org
===================================================================
My code (if any) in this post is copyright 2001, Chris F.A. Johnson
and may be copied under the terms of the GNU General Public License