31-3 Three algorithms for Fibonacci numbers

This problem compares the efficiency of three methods for computing the $n$th Fibonacci number $F_n$, given $n$. Assume that the cost of adding, subtracting, or multiplying two numbers is $O(1)$, independent of the size of the numbers.

a. Show that the running time of the straightforward recursive method for computing $F_n$ based on recurrence $\text{(3.22)}$ is exponential in $n$. (See, for example, the $\text{FIB}$ procedure on page 775.)

b. Show how to compute $F_n$ in $O(n)$ time using memoization.

c. Show how to compute $F_n$ in $O(\lg n)$ time using only integer addition and multiplication. ($\textit{Hint:}$ Consider the matrix

$$\begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix}$$

and its powers.)

d. Assume now that adding two $\beta$-bit numbers takes $\Theta(\beta)$ time and that multiplying two $\beta$-bit numbers takes $\Theta(\beta^2)$ time. What is the running time of these three methods under this more reasonable cost measure for the elementary arithmetic operations?

a. In order to solve $\text{FIB}(n)$, we need to compute $\text{FIB}(n - 1)$ and $\text{FIB}(n - 2)$. Therefore we have the recurrence

$$T(n) = T(n - 1) + T(n - 2) + \Theta(1).$$

From this recurrence we can easily get the upper bound $T(n) = O(2^n)$, but it is not tight.

The Fibonacci recurrence is defined as

$$F(n) = F(n - 1) + F(n - 2).$$

The characteristic equation of this recurrence is

$$\begin{aligned} x^2 & = x + 1 \\ x^2 - x - 1 & = 0. \end{aligned}$$

We can get the roots by the quadratic formula: $x = \frac{1 \pm \sqrt 5}{2}$.

We know that the solution of such a linear recurrence has the form

$$\begin{aligned} F(n) & = c_1\alpha_1^n + c_2\alpha_2^n \\ & = c_1\bigg(\frac{1 + \sqrt 5}{2}\bigg)^n + c_2\bigg(\frac{1 - \sqrt 5}{2}\bigg)^n, \end{aligned}$$

where $\alpha_1$ and $\alpha_2$ are the roots of the characteristic equation and the constants $c_1$, $c_2$ are determined by the initial values.

Since $T(n)$ satisfies the same recurrence as $F(n)$ up to an additive constant per call, the two grow at the same asymptotic rate.

Hence,

$$\begin{aligned} T(n) & = \Theta(F(n)) \\ & = \Theta\Bigg(\bigg(\frac{1 + \sqrt 5}{2}\bigg)^n\Bigg) \\ & \approx \Theta(1.618^n), \end{aligned}$$

since $\big|\frac{1 - \sqrt 5}{2}\big| < 1$, so the second term of the closed form vanishes as $n$ grows. The running time is therefore exponential in $n$.
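As a quick sanity check of this growth rate, here is a small Python sketch (our own illustration, not part of the text) that counts the calls made by the naive recursion; the count increases by a factor of roughly $\phi \approx 1.618$ each time $n$ grows by one.

    # Naive recursive Fibonacci, instrumented with a call counter.
    def fib_naive(n, counter):
        counter[0] += 1                       # count this invocation
        if n <= 1:
            return n
        return fib_naive(n - 1, counter) + fib_naive(n - 2, counter)

    for n in range(10, 31, 5):
        counter = [0]
        fib_naive(n, counter)
        print(n, counter[0])                  # grows roughly like 1.618^n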

b. This is the same as Exercise 15.1-5: memoize the recursion (or fill in $F_2, \ldots, F_n$ bottom up), so each $F_i$ is computed once with a single addition, for $O(n)$ time in total.
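A minimal memoized sketch in Python (our own, assuming the usual top-down formulation; the name `fib_memo` is illustrative):

    # Top-down memoized Fibonacci: each value is computed once, so O(n) additions.
    def fib_memo(n, memo=None):
        if memo is None:
            memo = {0: 0, 1: 1}
        if n not in memo:
            memo[n] = fib_memo(n - 1, memo) + fib_memo(n - 2, memo)
        return memo[n]

    print(fib_memo(50))                       # 12586269025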

c. Assume that all integer multiplications and additions can be done in $O(1)$. First, we want to show that

$$\begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix}^k = \begin{pmatrix} F_{k - 1} & F_k \\ F_k & F_{k + 1} \end{pmatrix}.$$

By induction on $k$: the base case $k = 1$ holds since $F_0 = 0$ and $F_1 = F_2 = 1$, and for the inductive step,

$$\begin{aligned} \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix}^{k + 1} & = \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix}^k \\ & = \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix} \begin{pmatrix} F_{k - 1} & F_k \\ F_k & F_{k + 1} \end{pmatrix} \\ & = \begin{pmatrix} F_k & F_{k + 1} \\ F_{k - 1} + F_k & F_k + F_{k + 1} \end{pmatrix} \\ & = \begin{pmatrix} F_k & F_{k + 1} \\ F_{k + 1} & F_{k + 2} \end{pmatrix}. \end{aligned}$$

We show that we can compute the given matrix to the power $n - 1$ in $O(\lg n)$ time, and its bottom-right entry is $F_n$.

We should note that any two $2 \times 2$ matrices can be multiplied with $8$ multiplications and $4$ additions, so a matrix multiplication takes constant time. Thus we only need to bound the number of matrix multiplications performed by our algorithm.

It takes $O(\lg n)$ time to run $\text{MATRIX-POW}(A, n - 1)$ because we halve the value of $n$ in each step and perform a constant amount of work within each step.

The recurrence is

$$T(n) = T(n / 2) + \Theta(1).$$

MATRIX-POW(A, n)
    if n == 0
        return I                // the 2 × 2 identity matrix
    if n % 2 == 1
        return A * MATRIX-POW(A^2, (n - 1) / 2)
    return MATRIX-POW(A^2, n / 2)
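A direct Python transcription of this scheme (our own sketch; the function names `mat_mul`, `mat_pow`, and `fib` are illustrative, and we use the fact that the bottom-right entry of $A^{n - 1}$ is $F_n$):

    # Repeated squaring of the 2 x 2 Fibonacci matrix.
    def mat_mul(X, Y):
        (a, b), (c, d) = X
        (e, f), (g, h) = Y
        return ((a * e + b * g, a * f + b * h),
                (c * e + d * g, c * f + d * h))

    def mat_pow(A, n):
        if n == 0:
            return ((1, 0), (0, 1))           # identity matrix
        half = mat_pow(mat_mul(A, A), n // 2)
        return mat_mul(A, half) if n % 2 == 1 else half

    def fib(n):
        if n == 0:
            return 0
        return mat_pow(((0, 1), (1, 1)), n - 1)[1][1]   # bottom-right entry

    print(fib(10))                            # 55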

d. First, we should note that $F_n \approx \phi^n / \sqrt 5$, so the numbers being manipulated have $\beta = \Theta(n)$ bits.
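As a quick empirical check of this (our own sketch, not part of the solution), the bit-length of $F_n$ grows linearly in $n$, with slope $\lg \phi \approx 0.694$:

    # The number of bits in F_n is about 0.694 * n, i.e., beta = Theta(n).
    def fib_iter(n):
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a

    for n in (100, 1000, 10000):
        print(n, fib_iter(n).bit_length())    # roughly 0.694 * n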

  • For part (a),

    Each call adds Fibonacci numbers whose values grow exponentially in $n$, hence whose bit-lengths grow linearly, so with $\beta = \Theta(n)$ the recurrence becomes

    $$\begin{aligned} T(n) & = T(n - 1) + T(n - 2) + \Theta(\beta) \\ & = T(n - 1) + T(n - 2) + \Theta(n), \end{aligned}$$

    which still has the solution $T(n) = \Theta\Big(\big(\frac{1 + \sqrt 5}{2}\big)^n\Big)$, since the polynomial term $\Theta(n)$ is absorbed by the exponential growth.

  • For part (b),

    The recurrence of the memoized version becomes

    $$M(n) = M(n - 1) + \Theta(\beta) = M(n - 1) + \Theta(n),$$

    since computing $F_n$ from $F_{n - 1}$ and $F_{n - 2}$ adds two $\Theta(n)$-bit numbers. This has the solution $\sum_{i = 2}^n \Theta(i) = \Theta(n^2)$, as evaluated below.

  • For part (c),

    We still perform only $\Theta(\lg n)$ matrix multiplications, but their entries are no longer constant-size: the entries of $A^m$ have $\Theta(m)$ bits, so squaring $A^m$ costs $\Theta(m^2)$. Along the repeated-squaring chain the exponent doubles at each step, so the total cost is

    $$\sum_{i = 0}^{\lg n} \Theta\big((2^i)^2\big) = \Theta(n^2),$$

    dominated by the last squaring of $\Theta(n)$-bit numbers, as evaluated below.
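For completeness, a worked evaluation of the two sums above (plain arithmetic; the only liberty taken is dropping the constants hidden in the $\Theta$'s):

$$\sum_{i = 2}^{n} i = \frac{n(n + 1)}{2} - 1 = \Theta(n^2) \qquad\text{and}\qquad \sum_{i = 0}^{\lg n} (2^i)^2 = \frac{4^{\lg n + 1} - 1}{3} = \frac{4n^2 - 1}{3} = \Theta(n^2).$$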