MERGE SORT

Sort(A):
    if size(A) <= 1: DONE
    Sort(first half of A)
    Sort(second half of A)
    Merge(first half, second half)

The merge is going to take THETA(n). If the running time for Merge Sort is T(n) (n is the size of the array), we can see that T(n) = 2 T(n/2) + THETA(n), and so T(n) = THETA(n lg n).

The Master Theorem, or the Master Method. The BOOK is a lot more rigorous and detailed. Suppose the running time of an algorithm can be expressed by a recurrence: T(n) = a T(n/b) + f(n). Take a look at n^(log_b(a)), and look at f(n). Whichever is slower (grows faster) is the running time of the algorithm. If they are the same, it's that thing times lg n.

Merge Sort: T(n) = 2 T(n/2) + n. Here n^(log_2(2)) = n^1 = n. They are the same, so the running time is THETA(n lg n).

Binary Search: T(n) = T(n/2) + 1. Here n^(log_2(1)) = n^0 = 1. They are the same, so the running time is THETA(lg n).

One more: T(n) = 2 T(n/4) + n^2. Here n^(log_4(2)) = n^(1/2), while f(n) = n^2.

A digression: when you raise one complex number to a complex power, n^e, the answer is a set rather than a number. If n is zero, the answer is the empty set. If e is an integer, the set has cardinality one. If e is rational, and the reduced fraction for e is p/q, the set has cardinality q. If e is irrational, or nonreal, the set is countably infinite. What is i^i? There are an infinite number of answers to this. (All of them are real.) The cube root of 8 is 2, and everyone agrees with this. The cube root of -8 is controversial: many would say -2 is the proper answer, but some would say it should be 2 cos(pi/3) + 2i sin(pi/3).

Anyway, since n^2 is greater than n^(1/2), this algorithm runs in THETA(n^2), since n^2 is the slower term.

SORTING

There are some very famous algorithms for sorting; most of the famous algorithms are not any good. The ones we mostly learn in CS122 suck like nobody's business. One of the most famous sorts used in industry is the Quick Sort, invented by the mathematician C.A.R. Hoare.

Quick Sort(A):
    if size(A) <= 1: DONE
    Choose a pivot (one of the elements of A).
    Separate the list into two smaller lists:
        stuff larger than (or equal to) the pivot,
        stuff smaller than (or equal to) the pivot.
    Quick Sort both lists.

This didn't become a popular industry sort until recursion-supporting languages were implemented AND programmers became comfortable using them. I first saw Quick Sort implemented in BASIC, which is not a recursive language. For a while, Shell Sort was more of a standard, since it is still pretty fast and you don't need recursion. Heap Sort was standard for a while, since you don't need recursion AND you don't need extra storage space AND it still runs in n lg n... but it's a SLOW n lg n. Now, for "reasonable length" arrays, Shell Sort's n lg^2 n may beat Heap Sort, because it's a fast n lg^2 n, but ultimately, if the array is large enough, Heap Sort will win.

There are different variations on Quick Sort:

1. You have to choose a pivot. How do you choose that pivot? When I learned Quick Sort, we chose the first element in the array as our pivot. Some people choose a random element to be the pivot. Median-of-three technique: choose three elements, and the pivot is the middle of those three.

If you are just that lucky, and you pick the exact middle element of the array every single time, your recurrence is T(n) = 2 T(n/2) + n, so T(n) = THETA(n lg n). This is if you are super lucky. It takes more math to get this result, but on average it will also take THETA(n lg n) time. If you choose the world's worst pivot every single time, it will take THETA(n^2) time, if you write it correctly. It's easy to miswrite the algorithm so that the world's worst pivot will cause the algorithm to run forever.
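The Quick Sort steps above (pivot, two-way split, recurse) can be sketched in Python. This is a minimal sketch, not the lecture's code; it uses the median-of-three pivot variation mentioned above (sampling three elements and taking the middle one):

```python
import random

def quick_sort(a):
    """Quick Sort, following the scheme in the notes above."""
    if len(a) <= 1:                  # size(A) <= 1: DONE
        return a
    # Median-of-three: sample three elements, pivot is the middle one.
    pivot = sorted(random.sample(a, 3))[1] if len(a) >= 3 else a[0]
    smaller = [x for x in a if x < pivot]    # stuff smaller than the pivot
    equal   = [x for x in a if x == pivot]   # ties sit between the two halves
    larger  = [x for x in a if x > pivot]    # stuff larger than the pivot
    return quick_sort(smaller) + equal + quick_sort(larger)
```

Keeping the pivot-equal elements out of both recursive calls is one way to dodge the "miswritten algorithm runs forever" trap: if elements equal to the pivot were recursed on, a list of identical values would never shrink.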
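For comparison, the Merge Sort pseudocode at the top of these notes, with its THETA(n) merge, might look like this in Python (again a sketch, not code from the lecture):

```python
def merge(left, right):
    """Merge two already-sorted lists in THETA(n) time."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])   # one of these is empty;
    out.extend(right[j:])  # the other holds the leftovers
    return out

def merge_sort(a):
    if len(a) <= 1:                     # size(A) <= 1: DONE
        return a
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]),   # Sort(first half of A)
                 merge_sort(a[mid:]))   # Sort(second half of A)
```

The two recursive calls on halves plus the linear merge are exactly the recurrence T(n) = 2 T(n/2) + THETA(n), which the Master Theorem resolves to THETA(n lg n).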