Stopwatch to time an async/await method is inaccurate
I'm writing some performance tests, and want to be able to time a method that is asynchronous. The code looks like this, where `action` is a `Func<Task<HttpResponseMessage>>`:
var sw = new Stopwatch();
HttpResponseMessage response = null;
sw.Start();
response = await action().ConfigureAwait(continueOnCapturedContext: false);
sw.Stop();
The code compiles and runs OK, but the measured milliseconds are about 100 times higher than the request timings we see in Fiddler: Fiddler reports 200-300 ms, but the stopwatch reports ~30,000 ms. Is there some catch to timing asynchronous methods? Is the solution to do the timing inside the action itself (which would be annoying)?
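For reference, here is a minimal, self-contained sketch of the setup described above. The URL and the `HttpClient` wiring are illustrative assumptions, not from the original post; only the `Stopwatch`/`await` pattern matches the question.

```csharp
// Minimal sketch of the measurement described in the question.
// The URL and HttpClient setup are illustrative assumptions.
using System;
using System.Diagnostics;
using System.Net.Http;
using System.Threading.Tasks;

class TimingSketch
{
    static async Task Main()
    {
        using var client = new HttpClient();

        // The action under test, as in the question: a Func<Task<HttpResponseMessage>>.
        Func<Task<HttpResponseMessage>> action =
            () => client.GetAsync("https://example.com/");

        var sw = new Stopwatch();
        sw.Start();
        HttpResponseMessage response =
            await action().ConfigureAwait(continueOnCapturedContext: false);
        sw.Stop();

        Console.WriteLine($"Status: {response.StatusCode}, elapsed: {sw.ElapsedMilliseconds} ms");
    }
}
```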
Solution 1 [1]:
This should work fine for measuring the true amount of time it takes for your asynchronous task to finish. You need to keep in mind that:
- the time you are measuring with Fiddler covers the request only, and none of the time it takes for your code to process the response.
- there is a significant difference in time here, and you should easily be able to time your code yourself to see how long it takes to get from a breakpoint before your request to one after it (see the sketch after this list). If that is closer to 30 seconds, then your stopwatch is probably accurate.
- there isn't anything obvious to me that would make your timing inaccurate.
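One way to check where the time goes is to measure the raw HTTP call and the surrounding code separately. The sketch below is a hedged illustration of that idea, not the original poster's code; the URL and names are assumptions. The inner stopwatch times only the request, while the outer one also includes whatever the calling code does with the response.

```csharp
// Sketch: compare the time for the HTTP request alone (inner) with the time
// for the request plus response handling (outer). Names and URL are illustrative.
using System;
using System.Diagnostics;
using System.Net.Http;
using System.Threading.Tasks;

class WhereDoesTheTimeGo
{
    static async Task Main()
    {
        using var client = new HttpClient();
        var outer = Stopwatch.StartNew();

        var inner = Stopwatch.StartNew();
        HttpResponseMessage response = await client
            .GetAsync("https://example.com/")
            .ConfigureAwait(false);
        inner.Stop();

        // Whatever the calling code does with the response is included in 'outer'
        // but not in 'inner' (here: reading the body).
        string body = await response.Content.ReadAsStringAsync().ConfigureAwait(false);
        outer.Stop();

        Console.WriteLine($"Request only:         {inner.ElapsedMilliseconds} ms");
        Console.WriteLine($"Request + processing: {outer.ElapsedMilliseconds} ms (body length {body.Length})");
    }
}
```

If the outer number is close to what the stopwatch in the question reports while the inner one matches Fiddler, the extra time is being spent in the code around the request rather than in the request itself.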
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Stack Overflow |
