Before diving into these methods, we suggest you become familiar with what the complexity and running time of an algorithm are. We recommend reading our previous article about Space and Time Complexity.

Substitution method

The first method we are going to present is the substitution method. This method comprises two steps:

  1. Guess the form of the solution
  2. Use mathematical induction to find the constants and show that the solution works

We can use the substitution method to establish either upper or lower bounds on a recurrence. Once we have found the recurrence of our algorithm, we make a guess about what its bound could be, and then we try to prove our hypothesis with this method.

As an example, let us determine an upper bound on the recurrence

T(n) = 2T(⌊n / 2⌋) + n

We guess the solution is T(n) = Ο(n lg n).

 

We want to prove that T(n) ≤ cn lg n for an appropriate choice of the constant c > 0 (remember, it is an upper bound, so each value should be less than or equal to our guessed solution). We start by assuming that this bound holds for all positive m < n, in particular for m = ⌊n / 2⌋, which gives T(⌊n / 2⌋) ≤ c ⌊n / 2⌋ lg(⌊n / 2⌋). Substituting into the recurrence yields:

T(n) ≤ 2(c ⌊n / 2⌋ lg(⌊n / 2⌋)) + n
     ≤ cn lg(n / 2) + n
     = cn lg(n) - cn lg(2) + n
     = cn lg(n) - cn + n
     ≤ cn lg(n)

 where the last step holds as long as c ≥ 1.

 

Mathematical induction now requires us to show that our solution holds for the boundary conditions. With T(1) = 1, we derive from the recurrence that T(2) = 4 and T(3) = 5. The bound cannot hold at n = 1 (cn lg n = 0 there, while T(1) = 1), but for n ≥ 4 the recurrence never depends directly on T(1), so we can take n = 2 and n = 3 as the base cases of the induction. To complete the inductive proof we just need to choose the value of c large enough so that T(2) ≤ c · 2 lg 2 and T(3) ≤ c · 3 lg 3. As it turns out, any choice of c ≥ 2 suffices for the base cases of n = 2 and n = 3 to hold. For most of the recurrences we shall examine, it is straightforward to extend the boundary conditions to make the inductive assumption work for small n, and we shall not always explicitly work out the details.
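Before (or after) doing the induction, we can sanity-check the guess numerically by evaluating the recurrence directly and comparing it against cn lg n. A minimal Python sketch, assuming the base case T(1) = 1 used above and the constant c = 2 found in the proof:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n: int) -> int:
    """Exact value of T(n) = 2*T(floor(n/2)) + n with T(1) = 1."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

c = 2  # constant from the inductive proof

# Check T(n) <= c * n * lg n for a range of n >= 2 (the base cases of the induction).
for n in range(2, 10_000):
    assert T(n) <= c * n * math.log2(n), f"bound fails at n = {n}"

print("T(n) <= 2 n lg n holds for 2 <= n < 10000")
```

This is only evidence, not a proof; the induction above is what actually establishes the bound for all n.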

Make a good guess

The main difficulty with the substitution method is that we need to make a guess that works in order to find the right solution of our recurrence. The main idea is to recognize that a new recurrence resembles one we have solved before and to try a similar solution for it. A recurrence that differs from a familiar one only in lower-order terms or in slightly different constants usually has the same asymptotic solution.

Possible mistakes

  • We always have to make sure that our solution works for the value we assign to the constant c. If our guess is correct but the induction fails only because of a lower-order term, we can overcome the difficulty by subtracting a lower-order term from the guess (T(n) ≤ cn becomes T(n) ≤ cn - d for some constant d ≥ 0); the stronger hypothesis is often easier to push through the induction, as illustrated in the sketch after this list.
  • We always have to prove the exact form of the inductive hypothesis; if we cannot prove this exact form, we cannot claim to have proven our hypothesis.
  • Sometimes we might be tempted to try a larger (looser) guess because it is easier to prove. Even if we can make such a guess work, the resulting bound is weaker, and a smaller guess may still be a valid, tighter solution.
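As a hypothetical illustration of the first point (this recurrence is not one of the examples above, just a standard case where the trick is needed), consider T(n) = T(⌊n / 2⌋) + T(⌈n / 2⌉) + 1 with T(1) = 1. The plain guess T(n) ≤ cn gets stuck at T(n) ≤ cn + 1 in the induction, while the strengthened guess T(n) ≤ cn - d goes through. A small Python check of the strengthened bound with c = 2 and d = 1:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n: int) -> int:
    """T(n) = T(floor(n/2)) + T(ceil(n/2)) + 1, with T(1) = 1."""
    if n == 1:
        return 1
    return T(n // 2) + T((n + 1) // 2) + 1  # floor half + ceiling half

c, d = 2, 1  # constants for the strengthened guess T(n) <= c*n - d

for n in range(1, 10_000):
    assert T(n) <= c * n - d, f"strengthened bound fails at n = {n}"

print("T(n) <= 2n - 1 holds for 1 <= n < 10000")
```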

Recursion tree

In a recursion tree, each node represents the cost of a single subproblem somewhere in the set of recursive function invocations. We sum the costs within each level of the tree to obtain a set of per-level costs, and then we sum all the per-level costs to determine the total cost of all levels of the recursion.

(Recursion tree illustration; image source: https://www.cs.cornell.edu/)

We determine the cost at each level of the tree, then add up the per-level costs to determine the cost of the entire tree. In this way we can obtain a good guess, which we then confirm using the substitution method explained above.
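For the recurrence T(n) = 2T(⌊n / 2⌋) + n used above, the tree has 2^i nodes at depth i, each contributing a cost of about n / 2^i, so every level costs about n and there are about lg n + 1 levels, suggesting the guess Ο(n lg n). A minimal Python sketch of this per-level bookkeeping, assuming for simplicity that n is a power of two so the halving is exact:

```python
import math

def recursion_tree_cost(n: int) -> int:
    """Sum the per-level costs of T(n) = 2T(n/2) + n for n a power of two, T(1) = 1."""
    total = 0
    level, size = 0, n
    while size > 1:
        nodes = 2 ** level       # number of subproblems at this depth
        total += nodes * size    # each subproblem of this size costs `size`
        level += 1
        size //= 2
    total += 2 ** level          # the leaves: n subproblems of size 1, cost 1 each
    return total

n = 1024
print(recursion_tree_cost(n))     # per-level sum: 11264
print(n * int(math.log2(n)) + n)  # the guess n lg n + n it suggests: 11264
```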

Master method

Through the master method we are able to directly solve recurrences of the form

T(n) = aT(n / b) + ƒ(n)

where a ≥ 1 and b > 1 are constants and ƒ(n) is an asymptotically positive function.

We need to memorize three simple cases that provide us with a direct solution for such recurrences (a small sketch that automates this case analysis follows the list):

  1. If ƒ(n) = Ο(n^(log_b a - ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a))
  2. If ƒ(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n)
  3. If ƒ(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if a ƒ(n / b) ≤ c ƒ(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(ƒ(n))
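For the common special case ƒ(n) = Θ(n^k) with a polynomial driving function (which covers all the examples below), the case analysis reduces to comparing k with log_b a. The following is a minimal Python sketch of that comparison under this polynomial assumption, not a full implementation of the theorem; the regularity condition of case 3 is automatically satisfied for polynomial ƒ, because a ƒ(n / b) = (a / b^k) ƒ(n) and a / b^k < 1 exactly when k > log_b a.

```python
import math

def master_theorem_polynomial(a: int, b: int, k: float) -> str:
    """Solve T(n) = a*T(n/b) + Theta(n^k), a >= 1, b > 1, by the master method."""
    critical = math.log2(a) / math.log2(b)  # log_b a, the 'watershed' exponent
    eps = 1e-9                              # tolerance for comparing float exponents
    if k < critical - eps:                  # case 1: f(n) grows slower than n^(log_b a)
        return f"Theta(n^{critical:g})"
    if abs(k - critical) <= eps:            # case 2: f(n) matches n^(log_b a)
        return f"Theta(n^{k:g} lg n)"
    return f"Theta(n^{k:g})"                # case 3: f(n) dominates

print(master_theorem_polynomial(8, 2, 2))   # -> Theta(n^3)
print(master_theorem_polynomial(2, 2, 1))   # -> Theta(n^1 lg n)
print(master_theorem_polynomial(2, 2, 2))   # -> Theta(n^2)
```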

Examples

T(n) = 8T(n / 2) + 30n^2
a = 8, b = 2, ƒ(n) = 30n^2
log_b a = log_2 8 = 3
ƒ(n) = 30n^2 = Ο(n^(log_b a - ε)) for ε = 1, so case 1 applies
T(n) = Θ(n^(log_b a)) = Θ(n^3)



T(n) = 2T(n / 2) + 6n
a = 2, b = 2, ƒ(n) = 6n
log_b a = log_2 2 = 1
ƒ(n) = 6n = Θ(n^(log_b a)), so case 2 applies
T(n) = Θ(n^(log_b a) lg n) = Θ(n lg n)



T(n) = 2T(n / 2) + n^2
a = 2, b = 2, ƒ(n) = n^2
log_b a = log_2 2 = 1
ƒ(n) = n^2 = Ω(n^(log_b a + ε)) for ε = 1
Regularity condition: a ƒ(n / b) = 2(n / 2)^2 = n^2 / 2 ≤ c n^2 for c = 1 / 2 < 1, so case 3 applies
T(n) = Θ(ƒ(n)) = Θ(n^2)
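
As a rough numerical sanity check of these three results (evidence, not a proof), we can evaluate each recurrence directly and watch the ratio T(n) / guess level off toward a constant as n grows. A small Python sketch, assuming n is a power of two and the base case T(1) = 1:

```python
import math
from functools import lru_cache

def make_recurrence(a, b, f):
    """Build a memoized T(n) = a*T(n // b) + f(n) with T(1) = 1."""
    @lru_cache(maxsize=None)
    def T(n):
        return 1 if n == 1 else a * T(n // b) + f(n)
    return T

cases = [
    (make_recurrence(8, 2, lambda n: 30 * n**2), lambda n: n**3,             "Theta(n^3)"),
    (make_recurrence(2, 2, lambda n: 6 * n),     lambda n: n * math.log2(n), "Theta(n lg n)"),
    (make_recurrence(2, 2, lambda n: n**2),      lambda n: n**2,             "Theta(n^2)"),
]

for T, guess, label in cases:
    ratios = [T(2**k) / guess(2**k) for k in (10, 15, 20)]
    print(label, [round(r, 3) for r in ratios])  # each list of ratios should flatten out
```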