The most famous and important unsolved problem in computer science is: does P equal NP?

Some unsolved problems in mathematics:

Goldbach Conjecture: Is every even number greater than 2 the sum of two primes? 4 = 2+2; 6 = 3+3; 8 = 3+5; 10 = 3+7; 12 = 5+7; 14 = 7+7; ... 100 = 3+97, and so forth.

Fermat's Last Theorem: Can you find positive integers a, b, c, n such that a^n + b^n = c^n for n > 2? 3^2 + 4^2 = 5^2, but can you sum cubes that way? Or fourth powers that way? Or really any power greater than 2 that way? The answer is no. (Unlike the others, this one is no longer open: Andrew Wiles proved it in the 1990s.)

The Collatz Conjecture: Start with a positive integer. If it's odd, multiply it by 3 and add 1. If it's even, divide it by 2. Keep going until you get to 1. 7 --> 22 --> 11 --> 34 --> 17 --> 52 --> 26 --> 13 --> 40 --> 20 --> 10 --> 5 --> 16 --> 8 --> 4 --> 2 --> 1. The conjecture: you will always get to 1, eventually.

Twin Prime Conjecture: Are there infinitely many pairs of primes that differ by two? 5-7, 11-13, 17-19, 41-43, 101-103, ...

A computable function is a function for which there is a program (or Turing machine) that can compute it. An algorithm is a systematic description of HOW that function can be computed. A program is one way to describe an algorithm; pseudo-code is another. Most people, however, describe algorithms in English.

Another consideration is the model of computation we are using. One way to describe algorithms is with a Turing machine. No sane person does it that way. The most common model of computation is the "RAM model." We imagine that we have as much memory as we need -- we will never run out. We also assume that integers can be as large as we need (we never have integer overflow), and that any attempt to allocate space (i.e., a new or a malloc call) will always succeed. The reasons we like the RAM model are that it more closely mirrors actual computers AND it gives better estimates of time complexity (efficiency). For example, we say that sorting is inherently an n lg n algorithm... but you ain't gonna get that on a Turing machine.
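A brute-force check of Goldbach's conjecture for small even numbers is easy to sketch in Python (the function names here are my own, not anything standard):

```python
def is_prime(n):
    """Trial division: O(sqrt(n)) per check."""
    if n < 2:
        return False
    for d in range(2, int(n ** 0.5) + 1):
        if n % d == 0:
            return False
    return True

def goldbach_pair(n):
    """Return one pair of primes summing to even n > 2, or None if none exists."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return (p, n - p)
    return None

print(goldbach_pair(100))  # (3, 97)
```

Of course, no amount of checking settles the conjecture; the loop just confirms it for the values you try.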
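The Collatz rule above takes only a few lines to express; here is a sketch (note the loop terminates only if the conjecture actually holds for the input):

```python
def collatz_sequence(n):
    """Return the Collatz sequence from n down to 1, assuming the conjecture holds."""
    seq = [n]
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2  # odd: 3n+1; even: halve
        seq.append(n)
    return seq

print(collatz_sequence(7))
# [7, 22, 11, 34, 17, 52, 26, 13, 40, 20, 10, 5, 16, 8, 4, 2, 1]
```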
We also imagine that arithmetic is a constant-time operation: multiplying two 100-digit numbers takes the same time as multiplying two 1-digit numbers. We tend to evaluate algorithms on two criteria: time complexity and space complexity. Time complexity (efficiency) is how many steps the algorithm takes. Space complexity is how much memory is allocated during the process. You can take this model to extremes.

Mean, Median, Mode: You have an array of integers; how do you compute the mean, median, and mode?

Mean: Add 'em all up and divide by the array size. Linear time: O(n).

Median: Sort the array and take the middle element. O(n lg n). There is a linear-time way to do this. It's really clever, and I hope we can get into it.

Mode: Count the number of times each element appears and take the one with the largest count. O(n^2). Better: sort the array first, then find the longest run of consecutive identical integers. O(n lg n). There is a way to do this in O(n) time. It is insane and makes use of the fact that you have as much RAM as you need.

We can analyze average-case time and worst-case time. We usually analyze worst-case time because it's easier.

Quick Sort: Pick a random element from the array (called the pivot). Divide the array into a set of elements less than or equal to the pivot, and a set of elements greater than the pivot. Recursively sort these portions of the array. On average this sort takes n lg n time, but in the worst case it takes n^2.
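The three statistics can be sketched directly; this version uses the sort-based O(n lg n) approach for both median and mode (function names are my own):

```python
def mean(a):
    return sum(a) / len(a)  # O(n)

def median(a):
    s = sorted(a)           # O(n lg n)
    return s[len(s) // 2]   # middle element (upper middle for even length)

def mode(a):
    # Sort, then scan for the longest run of identical values: O(n lg n).
    s = sorted(a)
    best, best_len, run = s[0], 1, 1
    for prev, x in zip(s, s[1:]):
        run = run + 1 if x == prev else 1
        if run > best_len:
            best, best_len = x, run
    return best

data = [4, 1, 2, 2, 3]
print(mean(data), median(data), mode(data))  # 2.4 2 2
```

The O(n) mode mentioned above would instead index a (possibly huge) count array by the values themselves, leaning on the RAM model's unlimited memory.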
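The quicksort description above can be sketched like this (a non-in-place version for clarity; the three-way split puts elements equal to the pivot in their own bucket, which sidesteps the question of which side ties go to):

```python
import random

def quicksort(a):
    """Randomized quicksort: O(n lg n) on average, O(n^2) worst case."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)                    # random pivot
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)

print(quicksort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```

The worst case happens when the pivot repeatedly lands at an extreme, so one side gets nearly everything and the recursion depth becomes linear.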