Running WebTrends against log file

Post by Roadrunne » Wed, 23 Sep 1998 04:00:00



I'm not technical so bear with me.

We are using WebTrends to analyze site traffic. We are particularly
interested in logging spider activity. We've altered the "robots.txt" file
to allow in all spiders. But when I run WebTrends against the Apache log
file, it's not recording spider activity.

So...does anyone know if you can/must specify certain "transactions" to be
logged, or does Apache not discriminate and log all requests??

TIA,
Roadrunner

 
 
 

Running WebTrends against log file

Post by Sevo Still » Wed, 23 Sep 1998 04:00:00



> I'm not technical so bear with me.

> We are using WebTrends to analyze site traffic. We are particularly
> interested in logging spider activity. We've altered the "robots.txt" file
> to allow in all spiders. But when I run WebTrends against the Apache log
> file, it's not recording spider activity.

Well, most likely its detection rules don't match any of the spiders
visiting your site.

> So...does anyone know if you can/must specify certain "transactions" to be
> logged, or does Apache not discriminate and log all requests??

Apache does not discriminate between requests: every request gets
logged, and any filtering must be done by the log-analysis tool.
Distinguishing robot requests from other user agents is not that easy.
While renowned search engines use unique user-agent identifiers you can
match against, rogue robots often mimic popular browsers' identifiers,
so they can only be discovered by spotting unusual request patterns,
such as a high number of rapid successive hits or an excessive total
hit count coming from a single IP. Standard log-analysis software
usually won't do anything more useful for spotting robots than listing
total hit counts by address and user agent.
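
The two heuristics above (known-crawler user-agent matching plus per-IP
hit counts) can be sketched in a few lines. This is an illustrative
Python sketch, not anything WebTrends does: the sample log lines, the
crawler-substring list, and the hit threshold are all made up; the
regex assumes the NCSA combined log format.

```python
import re
from collections import Counter

# Hypothetical sample lines in NCSA combined log format.
LOG_LINES = [
    '203.0.113.5 - - [23/Sep/1998:10:00:01 -0700] "GET /robots.txt HTTP/1.0" 200 68 "-" "Googlebot/1.0"',
    '198.51.100.9 - - [23/Sep/1998:10:00:02 -0700] "GET / HTTP/1.0" 200 1024 "-" "Mozilla/4.0"',
    '198.51.100.9 - - [23/Sep/1998:10:00:03 -0700] "GET /a HTTP/1.0" 200 512 "-" "Mozilla/4.0"',
    '198.51.100.9 - - [23/Sep/1998:10:00:04 -0700] "GET /b HTTP/1.0" 200 512 "-" "Mozilla/4.0"',
]

# Illustrative substrings found in well-known crawler user agents.
KNOWN_BOT_UAS = ("googlebot", "slurp", "scooter", "spider", "crawler")

# Matches IP and user agent out of a combined-format line.
COMBINED = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \d+ \S+ "[^"]*" "(?P<ua>[^"]*)"'
)

def classify(lines, hit_threshold=3):
    """Flag likely robot IPs by user-agent match or by a high per-IP
    hit count (a crude proxy for unusual request patterns)."""
    hits_per_ip = Counter()
    parsed = []
    for line in lines:
        m = COMBINED.match(line)
        if not m:
            continue  # skip malformed lines
        ip, ua = m.group("ip"), m.group("ua").lower()
        hits_per_ip[ip] += 1
        parsed.append((ip, ua))
    robots = set()
    for ip, ua in parsed:
        if any(bot in ua for bot in KNOWN_BOT_UAS) or hits_per_ip[ip] >= hit_threshold:
            robots.add(ip)
    return robots

print(sorted(classify(LOG_LINES)))  # → ['198.51.100.9', '203.0.113.5']
```

Note that the first IP is flagged by its user agent alone, while the
second is flagged only because it exceeds the hit threshold, which is
exactly why rogue robots that fake browser identifiers are harder to
catch.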

Sevo

--
Sevo Stille


 
 
 

Running WebTrends against log file

Post by Suza.. » Thu, 24 Sep 1998 04:00:00



> I'm not technical so bear with me.

> We are using WebTrends to analyze site traffic. We are particularly
> interested in logging spider activity. We've altered the "robots.txt" file
> to allow in all spiders. But when I run WebTrends against the Apache log
> file, it's not recording spider activity.

Does your Apache setup use the standard three-log style for storing its
data? If so, you'll need to analyze the referrer log rather than your
access log. If you have Apache set up to log everything into one log,
make sure it's in NCSA-compliant extended log format and everything
should work fine.
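
For reference, a minimal sketch of the httpd.conf directives that
produce a single log in the NCSA combined (extended) format. The
LogFormat and CustomLog directives and the format string are standard
Apache; the log file path is an assumption and will vary per install.

```
# Define the NCSA combined ("extended") format: the common fields
# plus the Referer and User-agent request headers.
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\"" combined

# Log every request to a single file in that format.
CustomLog logs/access_log combined
```

With the user-agent field present in the access log, an analyzer has
what it needs to attribute hits to spiders.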

Suzanne Baylor
WebTrends Corporation
http://www.webtrends.com/redirect/ng.htm

Sales and Info (503)294-7025 or 1-888-WebTrends

 
 
 

WebTrends Enterprise Suite vs. WebTrends Enterprise Reporting Server?

At work, I currently use WebTrends Enterprise Suite (3.5, I think)
on a Windows NT box for web log analysis and reporting.  It produces
pretty pictures and charts that executive sorts like, and does a
pretty good job of it.

Unfortunately, it also manages to crash randomly, as does the NT box
as a whole.  Having better things to do with my time than babysit the
two of them, re-run missed reports, and such, I've been considering
trying to move off the NT box onto a Solaris or Linux box.

I've noticed that WebTrends Enterprise Reporting Server is available
for Solaris and Linux, and costs the same as Enterprise Suite.  But
on WebTrends' web site, I have been unable to find a head-to-head
comparison of the two products.  Enterprise Suite's brochure says it's
the "most comprehensive" or some such, while Enterprise Reporting
Server's says it's the "most powerful."

I don't care about the marketing fluff; I want a comparison of specs,
features and such. :)  So I'll be pestering WebTrends to see if they
can cough one up.

In the meantime, I'm curious as to whether anyone out there in Usenet
land has actually used both of these products, or moved from one to
the other (either way).  If you have, I'd be interested in hearing
your thoughts, opinions and experiences.

I'm sending followups to c.i.w.s.unix, since I'm hoping to wind up on
the UNIX side of things.

-Dan

--
Dan Birchall - Palolo Valley - Hawaii - http://dan.scream.org/
Spam is for musubi, not e-mail.  Pass Hawaii Senate Bill 2352!
http://www.capitol.hawaii.gov/sessioncurrent/bills/sb2352_.htm
My address expires.  Take out the hex stamp if replies bounce.
