> I'm not technical so bear with me.
> We are using WebTrends to analyze site traffic. We are particularly
> interested in logging spider activity. We've altered the "robots.txt" file
> to allow in all spiders. But when I run WebTrends against the Apache log
> file, it's not recording spider activity.
Well, probably its rules do not match any of the spiders visiting your
site.

> So...does anyone know if you can/must specify certain "transactions" to be
> logged, or does Apache not discriminate and log all requests??
Apache does not discriminate between requests, i.e. every request will
be logged, and any discrimination must be done by the log-evaluating
tool.

Distinguishing robot requests from other user agents is not that easy:
while renowned search engines have unique user-agent identifiers to
match against, rogue robots often mimic the identifiers of popular
browsers, so they can only be discovered by spotting unusual request
patterns, such as a high rate of successive hits or an excessive total
hit count coming from a single IP address. Standard log-analysis
software usually won't do anything more useful for spotting robots than
listing total hit counts by address and user agent.