This is a snippet of the "Dynamic Programming" chapter of the book.
This chapter opens the final part of the book entitled "Selected Algorithms". In it, we're going to apply the knowledge from the previous chapters to analyzing a selection of important problems that are mostly application-independent and find use in many applied domains: optimization, synchronization, compression, and the like.
We will start with a single approach that is arguably the most powerful algorithmic technique in use. If we manage to reduce a problem to Dynamic Programming (DP), in most cases, we can consider it solved. The fact that we have progressed this far in the book without mentioning DP is quite amazing. Actually, we could have already talked about it several times, especially in the previous chapter on strings, but I wanted to confine this topic to its own chapter, so I deliberately didn't start the exposition earlier. Indeed, strings are one of the domains where dynamic programming is used quite heavily, but the technique finds application in almost every area.
Also, DP is one of the first marketing terms in CS. When Bellman invented it, he wanted to use the then-hyped term "programming" to promote his idea. This has, probably, caused more confusion over the years than benefit. In fact, a good although unsexy name for this technique could be simply "filling the table", as the essence of the approach is an exhaustive evaluation of all variants with memoization of partial results (in a table) to avoid redundant recomputation. Obviously, it brings a benefit only when there actually are redundant computations, which is not the case, for example, with combinatorial optimization. To determine if a problem may be solved with DP, we need to validate that it has the optimal substructure property:
A problem has optimal substructure if, when we take any of its subproblems, an optimal solution to the whole problem includes an optimal solution to this subproblem.
An example of optimal substructure is the shortest path problem. If the shortest path from point A to point B passes through some point C and there are multiple paths from C to B, the one included in the shortest path A-B must be the shortest of them. In fact, the shortest path is an archetypal DP problem, which we'll discuss later in this chapter. A counterexample is the Travelling Salesman Problem (TSP): if it had optimal substructure, the subpath between any two nodes in the resulting path would have to be the shortest possible path between these nodes. But this isn't true for all nodes, because it can't be guaranteed that the edges of the path will form a cycle with all the other shortest paths.
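To make the "filling the table" idea concrete before we specialize it below, here is a minimal, generic memoization sketch (not the book's code, just an illustration), where the "table" is a plain hash table keyed by the argument list:

(defun memoize (fn)
  ;; return a version of fn that caches its results;
  ;; the hash table plays the role of the DP table
  (let ((table (make-hash-table :test 'equal)))
    (lambda (&rest args)
      (multiple-value-bind (result found) (gethash args table)
        (if found
            result
            (setf (gethash args table)
                  (apply fn args)))))))

Note that to memoize a recursive function, the recursive calls themselves have to go through the cached version — which is exactly what we'll arrange by hand in the Fibonacci example below.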
Fibonacci Numbers
So, as we said, the essence of DP is filling a table. This table, though, may have a different number of dimensions for different problems. Let's start with the 1d case. What book on algorithms can omit discussing the Fibonacci numbers? Usually, they are used to illustrate recursion, yet they are also a great showcase for the power of memoization. Besides, recursion is, conceptually, an integral part of DP.
A naive approach to calculating the i-th number is to directly code the Fibonacci formula F(i) = F(i-1) + F(i-2), with F(0) = F(1) = 1:
(defun naive-fib (i)
  (assert (typep i '(integer 0)))
  (if (< i 2) 1
      (+ (naive-fib (- i 1))
         (naive-fib (- i 2)))))
However, applying it will result in exponential growth of the number of computations: each call to naive-fib results in two more calls. So, the number of calls needed for the n-th number, with this approach, is O(2^n).
> (time (naive-fib 40))
Evaluation took: 3.390 seconds of real time
165580141
> (time (naive-fib 42))
Evaluation took: 7.827 seconds of real time
433494437
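To see the redundancy directly, we can count the calls with a simple instrumented variant (a sketch, not part of the book's code; the counter variable is made up for illustration):

(defvar *naive-fib-calls* 0)

(defun counting-fib (i)
  ;; same recursion as naive-fib, but counts every invocation
  (incf *naive-fib-calls*)
  (if (< i 2) 1
      (+ (counting-fib (- i 1))
         (counting-fib (- i 2)))))

;; (progn (setf *naive-fib-calls* 0)
;;        (counting-fib 30)
;;        *naive-fib-calls*)
;; => well over a million calls for just the 30th number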
Yet, we can see here a direct manifestation of the optimal substructure property: the i-th number calculation uses the result of the (i-1)-th one. To utilize this recurrence, we'll need to store the previous results and reuse them. It may be achieved by changing the function call to a table access. Actually, from the point of view of math, tables and functions are, basically, the same thing.
(let ((fib (vec 1 1)))  ; our table will be an adjustable vector
  (defun fib (i)
    (when (< (length fib) i)
      (vector-push-extend (fib (- i 1)) fib))
    (+ (? fib (- i 1))
       (? fib (- i 2)))))
What we've done here is add a layer of memoization to our function using an array fib that is filled with the consecutive Fibonacci numbers. The array is hidden inside the closure of the fib procedure, so it will persist between the calls to it and accumulate the numbers as they are requested. There will also be no way to clear it, apart from redefining the function, as closed-over variables of this kind are not accessible outside of the function. The consecutive property is ensured by the arrangement of the recursive calls: the table is filled on the recursive ascent starting from the lowest yet unknown number. This approach guarantees that each Fibonacci number is calculated exactly once and reduces our dreaded O(2^n) running time to a mere O(n)!
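If the inability to clear the cache is a concern, a common alternative (a sketch, not the book's code) is to keep the memo in a hash table bound to a special variable, which can then be reset explicitly:

(defvar *fib-memo* (make-hash-table))

(defun memo-fib (i)
  ;; top-down DP with an explicitly resettable table
  (if (< i 2) 1
      (or (gethash i *fib-memo*)
          (setf (gethash i *fib-memo*)
                (+ (memo-fib (- i 1))
                   (memo-fib (- i 2)))))))

;; (clrhash *fib-memo*) discards the accumulated results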
Such a calculation is the simplest example of top-down DP, which is performed using recursion. Despite its natural elegance, it suffers from a minor problem that may turn significant, in some cases: extra space consumption by each recursive call. It's not only O(n) in time, but also in space. The alternative strategy that gets rid of this redundant space usage is called bottom-up DP and is based on loops instead of recursion. Switching to it is quite trivial, in this case:
(let ((fib (vec 1 1)))
  (defun bottom-up-fib (i)
    (let ((off (length fib)))
      (adjust-array fib (1+ i) :fill-pointer t)
      (dotimes (j (- (1+ i) off))
        (let ((j (+ j off)))
          (:= (aref fib j)
              (+ (aref fib (- j 1))
                 (aref fib (- j 2)))))))
    (aref fib i)))
> (time (bottom-up-fib 42))
Evaluation took: 0.000 seconds of real time
> (time (bottom-up-fib 4200))
Evaluation took: 0.004 seconds of real time
40512746637826407679504078155833145442086707013857032517543... ; this number is a Lisp bignum that has unlimited size
Funny enough, a real-world-ready implementation of Fibonacci numbers ends up not using recursion at all...
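For instance, if we only ever need the i-th number and not the whole table, a plain loop that keeps just the last two values is enough (a sketch with the same indexing as the functions above, using O(1) space):

(defun constant-space-fib (i)
  ;; keep only the two most recent numbers instead of the whole table
  (let ((prev 1)
        (cur 1))
    (dotimes (k (- i 1) cur)
      (psetf prev cur
             cur (+ prev cur)))))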
More details about the book may be found on its website.