Several months ago, I briefly summarized the ways to solve recurrence relations. At the end of that post, I indicated that different types of recurrence relations may require different kinds of treatment. Thus, this post will be the first "Downloadable Content (DLC)", with the aim of solving the recurrence relation: \(T(N) = 2T(N/2) + N\).
This recurrence relation comes from merge sort, and the algorithm itself represents a classic divide-and-conquer strategy: in order to sort \(N\) elements, we first sort two subarrays of \(N/2\) elements each (i.e., divide the problem into smaller problems and solve them recursively), and then we merge the two sorted \(N/2\)-element subarrays back into one sorted array of \(N\) elements (i.e., we patch together the answer in the conquer phase).
The exact recurrence relation we try to solve is the following, under the assumption that \(N\) is a power of 2:
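Restating it with its base case (the base case \(T(1) = 1\) is the usual convention for this recurrence):

\[
T(1) = 1, \qquad T(N) = 2\,T(N/2) + N.
\]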
There are two ways to solve this recurrence relation:
Method 1: Construct a telescoping sum
The goal of this method is to construct a telescoping sum (see telescoping series to get a sense of telescoping) in order to find a relation between \(T(N)\) and \(T(1)\) (or the base cases, in general).
Let's work through our example above to demonstrate this method. We divide the recurrence relation through by \(N\), repeatedly do the same for every possible argument (i.e., \(N, N/2, N/4, \dots, 2\)), and see what we get:
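Concretely, dividing each instance of the recurrence by its argument yields this stack of equations (reconstructed here; each line is the recurrence applied to a successively halved argument):

\[
\begin{aligned}
\frac{T(N)}{N} &= \frac{T(N/2)}{N/2} + 1 \\
\frac{T(N/2)}{N/2} &= \frac{T(N/4)}{N/4} + 1 \\
&\;\;\vdots \\
\frac{T(2)}{2} &= \frac{T(1)}{1} + 1
\end{aligned}
\]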
We add up all the equations: we add all of the terms on the left-hand side and set the result equal to the sum of all of the terms on the right-hand side. This leads to a telescoping sum: all the terms that appear on both sides cancel. For example, the term \(T(N/2)/(N/2)\) appears on both sides and thus cancels. After everything is added, the final result is:
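Only the first and last terms survive the cancellation, so the sum collapses to (a reconstruction of the standard result, assuming \(T(1) = 1\)):

\[
\frac{T(N)}{N} = \frac{T(1)}{1} + \log N,
\qquad\text{i.e.}\qquad
T(N) = N\log N + N = O(N \log N),
\]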
because all of the other terms cancel and there are \(\log N\) equations, and so all the \(1\)s at the end of these equations add up to \(\log N\).
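As a quick sanity check, a few lines of Python (not from the original post; it merely verifies the closed form numerically, assuming the base case \(T(1) = 1\)) confirm that \(T(N) = N\log N + N\) for powers of two:

```python
def T(n):
    """Merge-sort recurrence: T(1) = 1, T(N) = 2*T(N/2) + N."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# Compare against the closed form T(N) = N*log2(N) + N for N = 2, 4, ..., 1024.
for k in range(1, 11):
    n = 2 ** k
    assert T(n) == n * k + n  # n*log2(n) + n, with log2(n) = k
print("closed form verified")
```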
Note
For this recurrence relation, it is necessary to divide through by \(N\) in order to get a telescoping sum. However, how to construct the telescoping sum varies case by case. For instance, for the recurrence relation \(NT(N) = (N+1)T(N-1) + 2cN\), we need to divide through by \(N(N+1)\). For the recurrence relation \(T(N) = T(N-1) + cN\) ^{1}, we don't need any division at all: we just use the recurrence relation repeatedly for different values of \(N\) to construct the telescoping sum (i.e., \(T(N-1) = T(N-2) + c(N-1)\), \(T(N-2) = T(N-3) + c(N-2)\), and so on).
Method 2: Iterative substitution
For this method, we continually substitute the recurrence relation into the right-hand side, in the hope of finding a pattern that leads to the general solution of the recurrence relation.
We have
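Writing the recurrence once for \(N\) and once for \(N/2\):

\[
\begin{aligned}
T(N) &= 2\,T(N/2) + N, \\
T(N/2) &= 2\,T(N/4) + \frac{N}{2}.
\end{aligned}
\]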
Then, we substitute the second equation into the first equation's right-hand side and get:
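Carrying the substitution through:

\[
T(N) = 2\left(2\,T(N/4) + \frac{N}{2}\right) + N = 4\,T(N/4) + 2N.
\]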
Now we can substitute the expression for \(T(N/4)\) into the main equation, and we see that
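Using \(T(N/4) = 2\,T(N/8) + N/4\) and expanding:

\[
T(N) = 4\left(2\,T(N/8) + \frac{N}{4}\right) + 2N = 8\,T(N/8) + 3N.
\]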
We can continue this substitution, and if we observe the equations above, we can obtain the following pattern:
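After \(k\) substitutions, the pattern is:

\[
T(N) = 2^k\,T\!\left(\frac{N}{2^k}\right) + kN.
\]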
Using \(k = \log N\) (so that \(N/2^k = 1\)), we obtain
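The resulting closed form (assuming \(T(1) = 1\)) is:

\[
T(N) = N\,T(1) + N\log N = N\log N + N = O(N \log N).
\]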

This recurrence relation is actually a linear nonhomogeneous recurrence relation with constant coefficients. However, it cannot be solved by the method I wrote up in the last post. I have no clue why. This recurrence relation is taken from MAW p. 243. ↩