The pdh interface to Performance Monitor lets you measure three
process-related attributes that give the CPU usage of a process.
What confuses me is that all three of these are given as a *percentage of
elapsed time*. How can such a value be meaningful in an absolute way? Some
process may take 2 seconds of CPU time, but elapsed time is always a
variable. On a heavily loaded system the elapsed time might be 20 seconds;
on an unloaded system it might be 5 seconds. So the pdh counter will report
the processor time for this process as 10% in the first case and 40% in the
second, even though the actual amount of CPU time was the same in both
cases.
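
To spell the arithmetic out (these numbers are just my own hypothetical
example, not actual counter output), this is roughly what the counter
appears to be computing:

    #include <stdio.h>

    int main(void)
    {
        double cpu_seconds      = 2.0;   /* CPU time actually consumed by the process */
        double elapsed_loaded   = 20.0;  /* elapsed time on a heavily loaded system   */
        double elapsed_unloaded = 5.0;   /* elapsed time on an unloaded system        */

        /* The counter reports CPU time as a fraction of elapsed time, so the
           same 2 seconds of CPU shows up as two very different percentages. */
        printf("loaded:   %.0f%%\n", cpu_seconds / elapsed_loaded   * 100.0);  /* 10% */
        printf("unloaded: %.0f%%\n", cpu_seconds / elapsed_unloaded * 100.0);  /* 40% */
        return 0;
    }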
Is there some more reliable way to represent the CPU usage of a process that
would give me a good feel for how much of a machine's resources that process
takes up, expressed as some normalized value?
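
For comparison, here is a rough, untested sketch of the kind of thing I mean
by a normalized value: reading absolute CPU time with GetProcessTimes over a
fixed wall-clock interval and dividing by (elapsed time * number of
processors), rather than going through pdh at all. The sampling interval and
the use of GetTickCount64 are just my assumptions for illustration.

    #include <windows.h>
    #include <stdio.h>
    #include <stdlib.h>

    /* Convert a FILETIME (100-nanosecond units) to seconds. */
    static double filetime_to_seconds(FILETIME ft)
    {
        ULARGE_INTEGER u;
        u.LowPart  = ft.dwLowDateTime;
        u.HighPart = ft.dwHighDateTime;
        return (double)u.QuadPart / 1e7;   /* 10,000,000 ticks per second */
    }

    int main(int argc, char **argv)
    {
        DWORD pid = (argc > 1) ? (DWORD)atoi(argv[1]) : GetCurrentProcessId();
        HANDLE h = OpenProcess(PROCESS_QUERY_INFORMATION, FALSE, pid);
        if (!h) { fprintf(stderr, "OpenProcess failed\n"); return 1; }

        FILETIME createTime, exitTime, kernel1, user1, kernel2, user2;

        /* Sample CPU time at the start and end of a fixed wall-clock interval. */
        ULONGLONG wall1 = GetTickCount64();
        GetProcessTimes(h, &createTime, &exitTime, &kernel1, &user1);

        Sleep(5000);   /* arbitrary 5-second sampling interval */

        ULONGLONG wall2 = GetTickCount64();
        GetProcessTimes(h, &createTime, &exitTime, &kernel2, &user2);
        CloseHandle(h);

        double cpu_seconds  = (filetime_to_seconds(kernel2) + filetime_to_seconds(user2))
                            - (filetime_to_seconds(kernel1) + filetime_to_seconds(user1));
        double wall_seconds = (double)(wall2 - wall1) / 1000.0;

        SYSTEM_INFO si;
        GetSystemInfo(&si);

        /* Normalize: fraction of the whole machine's CPU capacity used by this
           process over the interval, independent of how loaded the machine is. */
        double normalized = cpu_seconds / (wall_seconds * si.dwNumberOfProcessors);

        printf("CPU seconds: %.3f over %.3f wall seconds\n", cpu_seconds, wall_seconds);
        printf("Share of machine: %.1f%%\n", normalized * 100.0);
        return 0;
    }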