hsauro

Measure Process Time


I am measuring the performance of some Delphi code using TStopwatch and I'm seeing considerable variation in the computed times. I run a method 10,000 times and record the time. I then do this 200 times (all within the same program), from which I get an average time of 3.79 seconds for the 10,000 runs, with a standard deviation of 0.55 seconds.

 

That seems to be quite a large variability. My question is: does TStopwatch measure the total elapsed time, or just the time spent in my particular process? If it measures elapsed time, is it possible to measure the amount of time spent specifically executing my code?
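For reference, a minimal sketch of the kind of timing loop I mean, using System.Diagnostics.TStopwatch (MyMethod is a placeholder for the method under test):

  uses
    System.SysUtils, System.Diagnostics;

  procedure TimeIt;
  var
    sw: TStopwatch;
    i: Integer;
  begin
    sw := TStopwatch.StartNew;   // high-resolution timer
    for i := 1 to 10000 do
      MyMethod;                  // placeholder for the method being measured
    sw.Stop;
    Writeln(Format('10,000 calls: %.3f s', [sw.ElapsedMilliseconds / 1000.0]));
  end;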

 

Herbert

Doesn't seem like a massive amount of variability. Probably depends on what your code does.

 

You are measuring what is known as wall clock time: total elapsed time. Why would you want to measure anything else? Wall clock time is the time as perceived by your users.


And then there are also memory refreshes and other system interrupts that will preempt your program from time to time. If you have a multi-processor machine, you could also set processor affinity to force the thread(s) to run on a single core, to minimize interruptions like this; see the sketch below.
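On Windows that could look something like this (a sketch; a mask of 1 pins the thread to the first logical core):

  uses
    Winapi.Windows, System.SysUtils;

  begin
    // Restrict the current thread to logical CPU 0 so the scheduler
    // cannot migrate it between cores during the measurement.
    if SetThreadAffinityMask(GetCurrentThread, 1) = 0 then
      RaiseLastOSError;
    // ... run the timing loop here ...
  end.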

1 hour ago, David Heffernan said:

Doesn't seem like a massive amount of variability. Probably depends on what your code does.

 

You are measuring what is known as wall clock time: total elapsed time. Why would you want to measure anything else? Wall clock time is the time as perceived by your users.

My concern was that if the time measured includes all CPU activity, then comparing different implementations of my code run at different times might give unreliable estimates, because there could be a different amount of background activity in the two runs. I presume this is the origin of most of the variability I am seeing. I have noticed there is an initial startup component to the variability (cache vs. hard drive, I assume) and that after the first few runs the times stabilize and become less variable. I've also noticed that timings reported by others often have no variability estimate on them.

 

Herbert

1 hour ago, stijnsanders said:

And then there are also memory refreshes and other system interrupts that will preempt your program from time to time. If you have a multi-processor machine, you could also set processor affinity to force the thread(s) to run on a single core, to minimize interruptions like this.

Exactly. That's why I was asking whether there is a way to measure the amount of time actually spent in my process, leaving out everything else, which is unpredictable.
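One Windows-specific option is the GetThreadTimes API, which reports the kernel and user CPU time actually charged to a thread, rather than wall clock time. A rough sketch (FILETIME values are 100-nanosecond ticks):

  uses
    Winapi.Windows, System.SysUtils;

  function ThreadCpuSeconds: Double;
  var
    CreationT, ExitT, KernelT, UserT: TFileTime;
    K, U: ULARGE_INTEGER;
  begin
    // Cumulative CPU time charged to the calling thread; time spent in
    // other processes or on system interrupts is not included.
    if not GetThreadTimes(GetCurrentThread, CreationT, ExitT, KernelT, UserT) then
      RaiseLastOSError;
    K.LowPart := KernelT.dwLowDateTime;  K.HighPart := KernelT.dwHighDateTime;
    U.LowPart := UserT.dwLowDateTime;    U.HighPart := UserT.dwHighDateTime;
    Result := (K.QuadPart + U.QuadPart) / 1e7;   // 100 ns ticks -> seconds
  end;

Call it before and after the loop and take the difference. The caveat is that these counters are typically updated only at the scheduler tick (roughly 15.6 ms by default), so the figure is only meaningful for runs lasting many milliseconds.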


Out of interest, I thought I'd show a scatter plot of the mean and standard deviation from 100 independent runs. Each data point was obtained by measuring how long it took to run a particular method, repeating this 10,000 times, and taking the mean and standard deviation of that data. Each set of 10,000 measurements was an independent run started from a batch file; the application itself was run 100 times from the batch file, which yields the 100 data points in the plots below. It is interesting to note some outliers around runs 35 and 90. I never touched the computer during this run, so this could be some internal housekeeping going on. The standard deviation has quite a bit of variability, but ignoring the outliers the mean is fairly consistent at 3.5 seconds. Having seen this data I have more confidence in the mean. The y axis is time and the x axis is the run number. The code is doing a lot of allocation and deallocation of small blocks of memory (16 bytes at a time) and a lot of simple calculations (but no I/O).
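As an aside, the per-run mean and standard deviation can be computed directly with the RTL's System.Math.MeanAndStdDev; a sketch, assuming the individual timings are collected in an array:

  uses
    System.Math;

  var
    Times: array of Double;   // one entry per measured repetition, in seconds
    M, SD: Extended;
  begin
    // ... fill Times from the TStopwatch measurements ...
    MeanAndStdDev(Times, M, SD);   // sample mean and standard deviation
  end.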

 

[Scatter plots: mean and standard deviation of the run time (y axis) against run number (x axis), for the 100 batch-file runs]

 

Out of interest, the following graphs were obtained when all 100 runs were done within the same application (no batch file). What I thought was interesting is the decline in the standard deviation as the program progressed. I'm not sure what the blips at around runs 20 and 70 are.

 

[Plots: mean and standard deviation of the run time against run number, for the 100 runs within a single application]

 

Herbert

8 hours ago, hsauro said:

Exactly. That's why I was asking whether there is a way to measure the amount of time actually spent in my process, leaving out everything else, which is unpredictable.

Who says that the variation isn't happening in your process?


What is your code doing?

Have you checked with a very simple function that is expected to take constant time (like Cos(), etc.)?
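Something like this, say, to check whether the variability persists with a fixed-cost workload (a sketch; the iteration count is arbitrary):

  uses
    System.Diagnostics;

  var
    sw: TStopwatch;
    i: Integer;
    x: Double;
  begin
    x := 0;
    sw := TStopwatch.StartNew;
    for i := 1 to 10000000 do
      x := x + Cos(i);        // constant-cost work: no allocation, no I/O
    sw.Stop;
    // Print a checksum so the compiler cannot drop the loop entirely.
    Writeln(sw.ElapsedMilliseconds, ' ms (checksum ', x:0:3, ')');
  end.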

