> Can anyone make a recommendation for a linux or other free unix based
> Intel system of a freeware http proxy program that has the capability to
> do url filtering. I'd like to setup a proxy server for outbound
> connections at a school and am looking for ways to block certain sites.
> Program names and locations are welcome.
Take a look at Squid and squidGuard.
Squid is an engineering project. It's meant to demonstrate high-performance
caching through a mesh of extremely well-connected servers around the
Internet. At bottom, it's a model to demonstrate application of the Internet
standard RFC 2186 (Squid *is* RFC 2187). Filtering just isn't its main
purpose. However, there's an add-on called "squidGuard" that does exactly
what you want. And it's all freeware, including the updates to the
"blacklist" of dirty sites.
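For reference, squidGuard plugs into Squid as a redirector program. A minimal
squid.conf fragment might look like the following sketch; the install paths are
assumptions, so check where your distribution actually put things:

```
# squid.conf -- hand each requested URL to squidGuard for checking
# (paths below are assumptions; adjust for your installation)
redirect_program /usr/local/bin/squidGuard -c /usr/local/squidGuard/squidGuard.conf
redirect_children 4
```

Four redirector children is just a starting point; busier caches want more.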
Three Notable Points on Squid:
1. Squid is an engineering project funded by the National Science
Foundation (NSF) and is being developed at the National Lab for Advanced
Network Research. (<http://www.veryComputer.com/>)
2. It's freeware, intended mainly for UNIX. (If you're running RH, it's
probably already installed on your box....)
3. It's a direct descendant of the Harvest server model.
Three Notable Points on squidGuard:
1. Produced by a Norwegian telephone/ISP company.
2. A "blacklist" of sites is updated regularly and is available for free via
FTP. So automating the filter update is trivial. The list is free; there's
no subscription fee. However, it's also rather thin; any kid with a bit of
imagination isn't going to be stopped by the stock list.
3. Uses Berkeley DB, so there's no dramatic performance penalty when using
it with huge filtering lists.
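To give an idea of the setup, a squidGuard.conf for your case might look
roughly like this sketch; the directory layout, category name, and redirect
URL are assumptions based on how the blacklist tarball is typically arranged:

```
# squidGuard.conf -- sketch only; paths and names are assumptions
dbhome /usr/local/squidGuard/db
logdir /usr/local/squidGuard/log

dest porn {
        domainlist porn/domains
        urllist    porn/urls
}

acl {
        default {
                pass !porn all
                redirect http://proxy.yourschool.edu/blocked.html
        }
}
```

After each blacklist update, run "squidGuard -C all" to recompile the flat
lists into Berkeley DB files, then "squid -k reconfigure" so Squid restarts
its redirector children. A nightly cron job that fetches the list over FTP
and runs those two commands covers the automated-update part.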
Set it up, try it out. Resource-wise, Squid wants increasing amounts of
disk space, RAM, and CPU as usage goes up.
ESiewick'a'DigiPro.com DigiPro Digital Productions, LLC
Voice: 703-522-8465 3100 North Quincy Street
Fax: 703-522-8417 Arlington, *ia 22207