The master method is a formula for solving recurrence relations of the form:
T(n) = aT(n/b) + f(n)

where,
n = size of the input
a = number of subproblems in the recursion
n/b = size of each subproblem (all subproblems are assumed to have the same size)
f(n) = cost of the work done outside the recursive calls, which includes the cost of dividing the problem and the cost of merging the solutions
Here, a ≥ 1 and b > 1 are constants, and f(n) is an asymptotically positive function, meaning that for a sufficiently large value of n, we have f(n) > 0.
The master theorem is used to calculate the time complexity of recurrence relations (divide-and-conquer algorithms) in a simple and quick way.
If a ≥ 1 and b > 1 are constants and f(n) is an asymptotically positive function, then the time complexity of the recurrence relation

T(n) = aT(n/b) + f(n)

is given by the following asymptotic bounds:

1. If f(n) = O(n^(log_b a - ϵ)), then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) * log n).
3. If f(n) = Ω(n^(log_b a + ϵ)), then T(n) = Θ(f(n)).

Here, ϵ > 0 is a constant. (In the full statement of the theorem, case 3 additionally requires the regularity condition a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n.)
Each of the above cases can be interpreted as follows:

1. In case 1, f(n) is polynomially smaller than n^(log_b a). Thus, the time complexity is dominated by the cost of the last level of the recursion tree, i.e., n^(log_b a).
2. In case 2, f(n) grows at the same rate as n^(log_b a). Thus, the time complexity is f(n) times the total number of levels, i.e., n^(log_b a) * log n.
3. In case 3, f(n) is polynomially larger than n^(log_b a). Thus, the time complexity is dominated by the cost of f(n).
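To make the case analysis concrete, here is a minimal Python sketch (an illustration, not part of the theorem itself) for recurrences of the restricted form T(n) = aT(n/b) + Θ(n^d), where comparing the exponent d against log_b a selects the case:

import math

def master_theorem_case(a, b, d):
    # Classify T(n) = a*T(n/b) + Theta(n^d) via the master theorem.
    # Assumes f(n) = Theta(n^d) for a constant d >= 0; this restricted
    # polynomial form is a sketch, not the full theorem.
    if a < 1 or b <= 1:
        raise ValueError("the master theorem requires a >= 1 and b > 1")
    c = math.log(a, b)  # the critical exponent log_b(a)
    if d < c:
        return f"Case 1: T(n) = Theta(n^{c:.2f})"
    if d == c:
        return f"Case 2: T(n) = Theta(n^{c:.2f} * log n)"
    return f"Case 3: T(n) = Theta(n^{d})"

print(master_theorem_case(2, 2, 1))  # merge sort: Theta(n * log n)
print(master_theorem_case(3, 2, 2))  # the solved example below: Theta(n^2)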
For example, consider

T(n) = 3T(n/2) + n^2

Here,
a = 3
n/b = n/2
f(n) = n^2

log_b a = log_2 3 ≈ 1.58 < 2, i.e., f(n) = Ω(n^(log_b a + ϵ)), where ϵ ≈ 0.42 is a constant. Case 3 applies here.

Thus, T(n) = Θ(f(n)) = Θ(n^2).
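As a quick numerical sanity check (a minimal sketch; the base case T(1) = 1 is an assumption, since the recurrence above does not specify one), evaluating T(n) directly shows the ratio T(n)/n^2 settling toward a constant, consistent with Θ(n^2):

from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 3*T(n/2) + n^2 with an assumed base case T(1) = 1.
    if n <= 1:
        return 1
    return 3 * T(n // 2) + n * n

# If T(n) = Theta(n^2), T(n)/n^2 approaches a constant
# (about 4 for this base case).
for n in (2**10, 2**14, 2**18):
    print(n, T(n) / n**2)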
The master theorem cannot be used if:
1. T(n) is not monotone, e.g., T(n) = sin n.
2. f(n) is not a polynomial, e.g., f(n) = 2^n.
3. a is not a constant, e.g., a = 2n.
4. a < 1.