A method for comparing different algorithms is "Algorithm Analysis". Basically it is a five-step process:

Step 1. Determine the INPUT SIZE of the algorithm, e.g. an array[20] or array[n].

Step 2. Identify the/a basic operation that is performed at each step of the algorithm. For example:

    for i = 0 to n-2 do
        for j = i+1 to n-1 do
            if A[i] == A[j] return true;
    return false;

The basic operation could be a comparison, an assignment statement, or an addition.

Step 3. Check whether the worst case and the best case are the same, i.e. whether the running-time complexity will be the same for both. Time complexities are expressed as one of the following: Big-Oh, Big-Omega, or Big-Theta.

Step 4. Write a mathematical summation for the number of times the algorithm's basic operation is executed.

Step 5. Using standard formulas and rules of sum manipulation, either find a closed-form formula for the count or, at the very least, establish its order of growth.

In the example above the basic operation is the comparison A[i] == A[j]. The worst case occurs when there are no equal elements at all, or when the last two elements are the only pair of equal elements, so compute the count for the worst case. You have two nested SUMS (apologies for the lack of math notation): the outer from i=0 to n-2 and the inner from j=i+1 to n-1, and the term being summed is 1, since each 1 counts one execution of the basic operation. Evaluating the sum gives (n-1)n/2: in the worst case the algorithm needs to compare all n(n-1)/2 distinct pairs of its n elements. This algorithm therefore has a running-time complexity of Big-Theta(n^2). So just give your boss a time complexity: Big-Oh, Big-Omega, or Big-Theta.
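To make the counting concrete, here is a small Python sketch of the pairwise duplicate check above, with a counter added to tally the basic operation (the comparison A[i] == A[j]). The names has_duplicate and comparisons are mine, purely for illustration:

```python
def has_duplicate(a):
    """Pairwise duplicate scan; returns (found, comparison_count)."""
    n = len(a)
    comparisons = 0
    for i in range(n - 1):           # i = 0 .. n-2
        for j in range(i + 1, n):    # j = i+1 .. n-1
            comparisons += 1         # the basic operation
            if a[i] == a[j]:
                return True, comparisons
    return False, comparisons

# Worst case: all elements distinct, so every pair is compared
# and the count matches the closed form n(n-1)/2.
found, count = has_duplicate(list(range(6)))   # n = 6, no duplicates
print(found, count)                            # False 15  (= 6*5/2)
```

Running it on a distinct array confirms the summation result: for n = 6 the scan performs exactly 6*5/2 = 15 comparisons before returning false.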
redcrow