Master Theorem

Tesla
Study Berkeley Notes CS

2024-09-03 23:46:45

Introduction

Suppose a recursive function $T(n)$ is defined as follows: $T(n) = aT(n/b) + f(n)$, where $a \geq 1$ and $b > 1$ are constants and $f(n)$ is a given function. This means we divide the problem into $a$ subproblems, each of size $n/b$, and the cost of dividing and combining the subproblems is $f(n)$.
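As a quick illustration (a minimal sketch, not from the original notes), the recurrence can be evaluated directly in Python; the base case $T(1)=1$ and the use of integer division for $n/b$ are assumptions made here for simplicity.

```python
def T(n, a, b, f):
    """Evaluate T(n) = a*T(n/b) + f(n), assuming the base case T(1) = 1."""
    if n <= 1:
        return 1
    # a subproblems of size n/b (integer division), plus divide/combine cost f(n)
    return a * T(n // b, a, b, f) + f(n)

# Example: a merge-sort-style recurrence T(n) = 2*T(n/2) + n
print(T(1024, a=2, b=2, f=lambda n: n))  # 11264 = 1024 * (10 + 1), roughly n*log2(n)
```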

Theorem

If $T(n) = aT(n/b) + O(n^d)$, then:

$$ T(n)=\left\{ \begin{aligned} &O(n^{\log_b a}) & \text{if } a>b^d \\ &O(n^d\log n) & \text{if } a=b^d \\ &O(n^d) & \text{if } a<b^d \end{aligned} \right. $$

Preliminary

Proof

After reconsideration, I found that the initial proof was not a serious one. A detailed proof can be found here: https://www.luogu.com.cn/article/w3avh1ku

Notation

  • $O$: Big-O notation, which describes an upper bound of a function: typically, as $x\rightarrow\infty$, $f(x)\leq M\cdot g(x)$ for some constant $M$.
  • $\Theta$: Theta notation, which describes a tight bound of a function.
  • $\Omega$: Omega notation, which describes a lower bound of a function.
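For completeness, the formal versions of the last two bounds (standard definitions, added here rather than taken from the original notes) are:

$$\begin{aligned} f(n)=\Omega(g(n)) &\iff \exists\, M>0 \text{ such that } f(n)\geq M\cdot g(n) \text{ for all sufficiently large } n,\\ f(n)=\Theta(g(n)) &\iff f(n)=O(g(n)) \text{ and } f(n)=\Omega(g(n)). \end{aligned}$$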

We can draw a recursion tree to analyze the time complexity of the recursive function:

Number of subproblems   Cost per subproblem
$1$                     $f(n)$
$a$                     $f(n/b)$
$a^2$                   $f(n/b^2)$
$\cdots$                $\cdots$
$a^{\log_b n}$          $f(1)$
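Before specializing $f(n)$, here is a small sanity check (a sketch with assumed names, not part of the original notes): summing the per-level costs $a^k f(n/b^k)$ of this tree gives the same total as unrolling the recurrence directly, at least when $n$ is an exact power of $b$ and we take $T(1)=f(1)$.

```python
def T_recursive(n, a, b, f):
    # Unroll T(n) = a*T(n/b) + f(n) directly, with the assumed base case T(1) = f(1)
    if n <= 1:
        return f(1)
    return a * T_recursive(n // b, a, b, f) + f(n)

def T_tree(n, a, b, f):
    # Sum the recursion tree level by level: level k has a^k nodes, each costing f(n/b^k)
    total, nodes, size = 0, 1, n
    while size > 1:
        total += nodes * f(size)
        nodes, size = nodes * a, size // b
    return total + nodes * f(1)  # bottom level: a^(log_b n) subproblems of size 1

a, b, f = 3, 2, (lambda n: n)
print(T_recursive(2**10, a, b, f), T_tree(2**10, a, b, f))  # both print 175099
```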

If $f(n)=n^d$, we get $$\begin{aligned}T(n) &= f(n) + af\left(\frac{n}{b}\right) + a^2f\left(\frac{n}{b^2}\right) + \cdots + a^{\log_b n}f(1)\\ &= f(n)\left(1+\frac{a}{b^d}+\frac{a^2}{b^{2d}}+\cdots+\frac{a^{\log_b n}}{b^{d\log_b n}}\right)\end{aligned}$$

Let $q=\frac{a}{b^d}$; then $T(n)=f(n)(1+q+q^2+\cdots+q^{\log_b n})$. For $q\neq 1$, we can use the geometric series formula to simplify the expression: $$T(n)=f(n)\,\frac{1-q^{\log_b n+1}}{1-q}$$

  • $q>1$ (i.e. $a>b^d$): Noticing that $q^{\log_b n}=n^{\log_b q}$, the sum is dominated by its largest term, so $T(n)\approx f(n)\,q^{\log_b n}=n^d\cdot n^{\log_b a-d}=n^{\log_b a}\Rightarrow T(n)=\Theta(n^{\log_b a})$.
  • $q=1$ (i.e. $a=b^d$): Each of the $\log_b n+1$ terms equals $1$, so $T(n)=f(n)(\log_b n+1)\Rightarrow T(n)=\Theta(n^d\log n)$.
  • $q<1$ (i.e. $a<b^d$): $T(n)=f(n)\,\frac{1-q^{\log_b n+1}}{1-q}\leq\frac{f(n)}{1-q}\Rightarrow T(n)=\Theta(f(n))=\Theta(n^d)$.

Application

  • Large-number multiplication: $T(n)=3T(n/2)+O(n)$ (we split each number into two halves, and the product can be assembled from three recursive half-size multiplications plus $O(n)$ additions), so $a=3$, $b=2$, $f(n)=O(n)$, $a>b^d$, and $T(n)=\Theta(n^{\log_2 3})$.
  • Merge sort: $T(n)=2T(n/2)+O(n)$, so $a=2$, $b=2$, $f(n)=O(n)$, $a=b^d$, and $T(n)=\Theta(n\log n)$. Both recurrences are classified by the sketch below.
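As a recap (a small sketch with assumed names, not part of the original notes), the theorem's case split can be written directly in code and checked against the two examples above:

```python
import math

def master_theorem(a, b, d):
    """Return the bound the master theorem gives for T(n) = a*T(n/b) + O(n^d)."""
    if a > b**d:
        return f"Theta(n^{math.log(a, b):.3f})"  # exponent is log_b a
    if a == b**d:
        return f"Theta(n^{d} * log n)"
    return f"Theta(n^{d})"

print(master_theorem(3, 2, 1))  # large-number multiplication: Theta(n^1.585)
print(master_theorem(2, 2, 1))  # merge sort: Theta(n^1 * log n)
```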