7.1 Time Complexity

If a problem is decidable or solvable, does that mean you can implement a solution using a real computer and get useful results?
  not always; the solution may require too much time or memory

Is the language A decidable? A = { 0^k 1^k | k >= 0 }
  yes, the Turing machine M1 decides A
  M1 = "On input string w:
    1. Scan the tape; reject if a 0 is found to the right of a 1.
    2. Repeat line 3 while both 0s and 1s remain on the tape.
    3. Scan the tape; cross off a single 0 and a single 1.
    4. If no 0s and no 1s remain on the tape, accept; otherwise reject."

How much time does M1 need for a string of length n?
How much time does M1 need if the string is made up of all 0s?

What's worst-case analysis?
  find the longest time over all possible length-n inputs

What's average-case analysis?
  find the average time over all possible length-n inputs

What's the 'Time Complexity' of a Turing Machine?
  given a deterministic TM M that halts on all inputs,
  the time complexity of M is a function f: N -> N,
  where f(n) is the maximum number of steps M uses on any length-n input

Big-Oh

Does n^2 + 2n really give the exact number of steps used by M1?
Do you really need to know the exact number of steps used by a TM?
Is it possible to translate steps into amounts of real time?
Could you build a machine that does more in each step and uses fewer steps for the same algorithm?
What do you conclude about constant factors in the time-complexity function?

When is running time important, for small inputs or for large inputs?
How does the function n^2 + n compare to the function n^2 when n is large?
When n is 10000, what is n^2 + n?
When n is 10000, what is n^2?
What do you conclude about lower-order terms in the time-complexity function?

Given two functions f and g, what does it mean to say f(n) = O(g(n))?
  positive integers c and n0 exist such that
  for every integer n >= n0, f(n) <= c * g(n)

When f(n) = O(g(n)), how is g(n) an 'Upper Bound' for f(n)?
  f is no larger than g, when ignoring constant factors
  (think of O as a hidden constant)

What's the Big-Oh for the function f1(n) = 8n^3 + 4n^2 + 2n + 1?
  Is f1(n) = O(n^8)?
  Is f1(n) = O(n^3)?
  Is f1(n) = O(n)?

How do you find the Big-Oh for any polynomial?
  keep only the high-order term (drop low-order terms)
  drop the constant

What's the Big-Oh for the machine M1?

Big-Oh and Logarithms

What's log2(1000)? What's log10(1000)?
What's log2(1000000)? What's log10(1000000)?
What's log2(1000)/log10(1000)? What's log2(1000000)/log10(1000000)?

What's logb(n) in terms of log2(n)?
  logb(n) = log2(n) / log2(b)
  log2(b) is a constant factor

What do you conclude about the effect of the base of the logarithm on the Big-Oh?
  the base changes the result by a constant factor
  you can ignore the base of the logarithm in Big-Oh's

What's the Big-Oh for the function f2(n) = 4*log2(n) + 2*log2(log2(n)) + 1?

Which statements are true and which statements are false?
  n = O(n)
  log n = O(n)
  n = O(log n)
  n^10 = O(2^n)
  2^n = O(n^10)
  n log n = O(n)
  n = O(n log n)

How do you simplify an expression that contains Big-Oh's?
  f(n) = O(n) + O(log n) + O(n^3)

What happens when you have a Big-Oh in an exponent?
  f(n) = 2^O(n) means an upper bound of 2^(c*n) for some constant c

What happens when you have a log in a Big-Oh in an exponent?
  f(n) = 2^O(log n) is a bound of n^c for some c (same as n^O(1))
  (from 2^(log2(n)) = n it follows that 2^(c*log2(n)) = n^c)

What's a Polynomial Bound?
  form is n^c for some c > 0

What's an Exponential Bound?
  form is 2^(n^d) for some real number d > 0

Given two functions f and g, what does it mean to say that f(n) = o(g(n))?
  for any real number c > 0 (must work for any c),
  a number n0 exists (note that n0 can depend on c),
  where f(n) < c * g(n) for all n >= n0
  (like Big-Oh but strictly less than)

Which statements are true and which statements are false?
  n = o(n)
  log n = o(n)
  n = o(log n)
  n^10 = o(2^n)
  2^n = o(n^10)
  n log n = o(n)
  n = o(n log n)
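The running time of M1 from earlier can be explored with a small Python sketch (not part of the original notes). It simulates the crossing-off algorithm on a list of tape symbols and counts one "step" per tape cell scanned — a rough model of the TM's step count, not an exact one — and the count grows roughly like n^2/2, consistent with O(n^2):

```python
def m1_steps(w):
    """Simulate M1 on string w over {0,1}; return (accepts, steps).

    Each full scan of the tape counts len(tape) steps -- a rough
    approximation of the TM's head movements, not an exact count.
    """
    tape = list(w)
    steps = 0

    # Stage 1: one full scan; reject if a 0 appears to the right of a 1.
    steps += len(tape)
    seen_one = False
    for c in tape:
        if c == '1':
            seen_one = True
        elif c == '0' and seen_one:
            return False, steps

    # Stages 2-3: repeatedly scan the tape, crossing off one 0 and one 1.
    while '0' in tape and '1' in tape:
        steps += len(tape)
        tape[tape.index('0')] = 'x'
        tape[tape.index('1')] = 'x'

    # Stage 4: one final scan to check that no 0s or 1s remain.
    steps += len(tape)
    return ('0' not in tape and '1' not in tape), steps

if __name__ == "__main__":
    for k in (5, 10, 20):
        accepts, steps = m1_steps('0' * k + '1' * k)
        n = 2 * k
        print(n, accepts, steps)  # steps grows roughly like n^2/2
```

Doubling k roughly quadruples the step count, which is the quadratic growth the worst-case analysis predicts.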
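The Big-Oh definition above can be checked numerically for f1(n) = 8n^3 + 4n^2 + 2n + 1. The witnesses c = 15 and n0 = 1 (chosen here for illustration) work because once n >= 1 each lower-order term is at most the matching multiple of n^3, so 8n^3 + 4n^2 + 2n + 1 <= (8+4+2+1) n^3:

```python
def f1(n):
    return 8 * n**3 + 4 * n**2 + 2 * n + 1

# Witnesses for f1(n) = O(n^3): for n >= 1, 4n^2 <= 4n^3, 2n <= 2n^3,
# and 1 <= n^3, so c = 8 + 4 + 2 + 1 = 15 and n0 = 1 satisfy the definition.
c, n0 = 15, 1
assert all(f1(n) <= c * n**3 for n in range(n0, 10_000))

# By contrast, no constant c makes f1(n) <= c * n hold for all large n,
# so f1(n) != O(n): the ratio f1(n) / n grows without bound.
print(f1(10) / 10, f1(1000) / 1000)  # ratio keeps growing
```

The same reasoning gives the general polynomial rule: keep the high-order term and drop the constant.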
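For the little-o statements, f(n) = o(g(n)) means the ratio f(n)/g(n) can be driven below any c > 0 by taking n large enough. Watching the ratio at a few sample points gives a quick informal check (not a proof — these are illustrative probes, not part of the original notes):

```python
import math

def ratios(f, g, ns=(10, 100, 1000)):
    """Return f(n)/g(n) at a few sample points.

    Informal check only: a ratio heading toward 0 suggests f(n) = o(g(n));
    a ratio that stays put or grows suggests it is not.
    """
    return [f(n) / g(n) for n in ns]

print(ratios(math.log2, lambda n: n))                   # shrinks: log n = o(n)
print(ratios(lambda n: n**10, lambda n: 2**n))          # shrinks: n^10 = o(2^n)
print(ratios(lambda n: n, lambda n: n * math.log2(n)))  # shrinks: n = o(n log n)
print(ratios(lambda n: n, lambda n: n))                 # stays at 1: n != o(n)
```

Note that n = O(n) is true while n = o(n) is false: Big-Oh only needs the ratio to stay bounded by some constant, while little-o needs it to fall below every constant.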