Why does my running-time graph have outliers?
I implemented three algorithms: merge sort, insertion sort, and fast multiply. They all work correctly; I wrote extensive unit tests to confirm that. Now I am measuring their running time over a large set of inputs of various sizes, but my graphs have outliers. What could be the causes?
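For context, here is a minimal sketch (function names and parameters are my own, not from the assignment) of how one can repeat each measurement and keep the median, which damps one-off spikes from OS scheduling, caches, or the garbage collector:

```python
import random
import statistics
import time

def time_once(fn, arg):
    # One wall-clock measurement using a monotonic high-resolution timer.
    start = time.perf_counter()
    fn(arg)
    return time.perf_counter() - start

def robust_time(fn, arg, repeats=5):
    # The median of several runs is far less sensitive to a single
    # slow run than the mean or a single measurement.
    return statistics.median(time_once(fn, arg) for _ in range(repeats))

# Example: time Python's built-in sort on a shuffled input.
data = list(range(10_000))
random.shuffle(data)
elapsed = robust_time(sorted, data)
```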
The things I considered:
- My shuffling algorithm (move everything over by one, then randomly swap two elements n times, where n is the size of the array; this ensures very few elements remain in the same place and there is a decent amount of variation in each iteration)
- Random hardware effects / chance (I would like a better explanation of this)
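The shuffle described in the first bullet can be sketched as follows (this is my reading of the description; the function name is made up, and note that this is not a uniform shuffle like Fisher–Yates):

```python
import random

def rotate_and_swap_shuffle(arr):
    # Step 1: move everything over by one (rotate right by one position).
    arr = arr[-1:] + arr[:-1]
    n = len(arr)
    # Step 2: randomly swap two elements, n times.
    for _ in range(n):
        i = random.randrange(n)
        j = random.randrange(n)
        arr[i], arr[j] = arr[j], arr[i]
    return arr
```

Since every element is first displaced by the rotation, very few elements end up in their original positions, which matches the stated goal of variation between iterations.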
[Plots: merge sort running time, fast multiply running time]
I am refraining from posting the code, since these are assignments and I do not want to be flagged for plagiarism.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow