I use good old pen and paper :)
A hardworking idiot is much worse than an idiot. (C) Unknown.
Seems to me that the easiest way to compare the complexity of two different algorithms is to count the number of operations (with no regard to their weight) required to process an input of a million records or more (one record never works out...). This is the more academic approach. It breaks down if the algorithm that seems less complex calls a subroutine that increases the actual complexity by a significant margin, but for academic purposes it's just fine.

If you're looking for a more engineering approach, you could time how long each algorithm takes to process the same million-record input and compare the results. The same approach works for timing individual operations. There will be some deviation introduced by the timing commands themselves, but it should stay constant, provided the machine isn't running other tasks in parallel and the same machine is used for all tests.
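To illustrate both approaches, here's a minimal sketch in Python: it compares linear and binary search (chosen only as example algorithms, not anything from the original post) on a million records, first by counting comparisons, then by wall-clock timing on the same input.

```python
import time

def linear_search(data, target):
    """Scan left to right; return (index, comparison count)."""
    ops = 0
    for i, value in enumerate(data):
        ops += 1
        if value == target:
            return i, ops
    return -1, ops

def binary_search(data, target):
    """Binary search on sorted data; return (index, comparison count)."""
    ops = 0
    lo, hi = 0, len(data) - 1
    while lo <= hi:
        ops += 1
        mid = (lo + hi) // 2
        if data[mid] == target:
            return mid, ops
        elif data[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, ops

records = list(range(1_000_000))
target = records[-1]  # worst case for the linear scan

# Academic approach: compare raw operation counts, ignoring weights.
_, linear_ops = linear_search(records, target)
_, binary_ops = binary_search(records, target)
print(f"linear search: {linear_ops:,} comparisons")
print(f"binary search: {binary_ops:,} comparisons")

# Engineering approach: time both on the same input, same machine.
start = time.perf_counter()
linear_search(records, target)
linear_time = time.perf_counter() - start

start = time.perf_counter()
binary_search(records, target)
binary_time = time.perf_counter() - start
print(f"linear: {linear_time:.4f}s, binary: {binary_time:.6f}s")
```

The timing calls themselves add a small constant overhead, which is exactly the deviation mentioned above; as long as it's the same for both runs, the comparison remains fair.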