Wall-Clock Time Question

I am testing two versions of our application: one built with .NET 1.1 and the other with .NET 3.5.

The testing is isolated to a development workstation, and all external dependencies (SQL Server, etc.) are the same.

When comparing wall-clock time for the top-level process in a given call stack, I do not understand the results I am seeing.

Application A shows a time of 250.764 and application B shows 229.300.

Based on these numbers, I would expect application B to complete the task more quickly than application A.

When run outside the profiler, however, application A completes the task in 70% of the time taken by application B.

Any help in explaining what I am misunderstanding would be appreciated.
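
For reference, a minimal sketch of how the out-of-profiler wall-clock measurement might be done, assuming a hypothetical RunWorkload() standing in for the task under test (Stopwatch requires .NET 2.0+, so it applies to the 3.5 build; the 1.1 build would have to fall back on DateTime.UtcNow):

```csharp
using System;
using System.Diagnostics;

class WallClockTimer
{
    static void Main()
    {
        // Warm up once so JIT compilation does not count against the timed run.
        RunWorkload();

        Stopwatch sw = Stopwatch.StartNew();
        RunWorkload();
        sw.Stop();

        Console.WriteLine("Wall-clock time: {0:F3} s", sw.Elapsed.TotalSeconds);
    }

    // Hypothetical placeholder for the top-level operation being compared.
    static void RunWorkload()
    {
        // ... the task measured in the profiler ...
    }
}
```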

Comments

  • You can only do accurate *absolute* timings when the profiler is in low-overhead mode, which rules out line-level profiling. Can you repeat the experiment in sampling mode? Also, wall-clock times are heavily influenced by external latencies. Try CPU timing first, assuming it is *absolute* times you are interested in comparing; a sketch contrasting the two measurements follows after this thread.

    Thank you for the information: I'll try it with those settings and get back to you if I am still confused.
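
For illustration, a minimal sketch of the comparison the commenter suggests, contrasting CPU time (which excludes time spent blocked on external dependencies such as SQL Server round trips) with wall-clock time; RunWorkload() is again a hypothetical stand-in for the task under test:

```csharp
using System;
using System.Diagnostics;

class CpuVsWallClock
{
    static void Main()
    {
        Process proc = Process.GetCurrentProcess();
        TimeSpan cpuBefore = proc.TotalProcessorTime;
        Stopwatch wall = Stopwatch.StartNew();

        RunWorkload();

        wall.Stop();
        proc.Refresh(); // re-read the cached process counters
        TimeSpan cpuUsed = proc.TotalProcessorTime - cpuBefore;

        Console.WriteLine("Wall-clock: {0:F3} s", wall.Elapsed.TotalSeconds);
        Console.WriteLine("CPU time:   {0:F3} s", cpuUsed.TotalSeconds);
    }

    // Hypothetical placeholder for the operation under test.
    static void RunWorkload()
    {
        // ... the task measured in the profiler ...
    }
}
```

If the CPU times of the two builds agree while the wall-clock times diverge, the difference is most likely in waits on external calls rather than in the code itself.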