Week 32 · Phase 4 — The Toolbox
The function that calls itself — and why that's not a paradox.
Photo · Hafsa LR / Unsplash
The chef can do something deeply weird: a function can call itself. The same function. With a simpler argument. And not paradoxically, but cleanly. Each call gets its own stack frame, its own local variables, its own copy of the parameters. The function eventually hits a small enough case that it stops calling itself and returns. The chain unwinds, results combine, and the answer falls out.
This trick — recursion — is the natural fit for a wide class of problems: anything that contains a smaller version of itself. Tree traversals, file-system walks, grammar parsing, factorial, Fibonacci, divide-and-conquer sorts, every game tree, every search. Whenever the problem has a self-similar structure, the cleanest code is recursive.
Every recursive function has the same shape:
// factorial — n! = n × (n-1) × ... × 1
int factorial(int n) {
    if (n <= 1) return 1;            // base case
    return n * factorial(n - 1);     // recursive step
}
factorial(5);
// = 5 * factorial(4)
// = 5 * 4 * factorial(3)
// = 5 * 4 * 3 * factorial(2)
// = 5 * 4 * 3 * 2 * factorial(1)
// = 5 * 4 * 3 * 2 * 1 = 120
Each call gets a fresh frame on the stack. The chef pushes all the way down to factorial(1), which returns 1 directly, and then the stack unwinds, multiplying as it goes.
// visit every node in a binary tree (in-order)
void traverse(Node* n) {
    if (!n) return;        // base: empty tree
    traverse(n->left);     // recurse
    visit(n);
    traverse(n->right);    // recurse
}
Trees are recursive structures: a tree is either empty, or a node with a value and two trees. The traversal mirrors the structure exactly. To do this iteratively you'd need an explicit stack — much messier.
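For contrast, here is roughly what the same in-order walk looks like without recursion: a sketch assuming the same Node struct (left/right pointers) and visit() hook as above. The bookkeeping the call stack did for free now has to be managed by hand.
// the same in-order traversal, iteratively, with an explicit stack
#include <stack>

void traverse_iterative(Node* n) {
    std::stack<Node*> pending;            // stands in for the call stack
    while (n || !pending.empty()) {
        while (n) {                       // walk down the left spine, remembering each node
            pending.push(n);
            n = n->left;
        }
        n = pending.top(); pending.pop();
        visit(n);                         // left subtree done: handle this node
        n = n->right;                     // then do the same for the right subtree
    }
}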
// merge sort — split, sort each half, merge
#include <algorithm>   // std::merge
#include <vector>

void mergesort(std::vector<int>& v) {
    if (v.size() <= 1) return;                       // base: already sorted
    auto mid = v.begin() + v.size() / 2;
    std::vector<int> left(v.begin(), mid);           // first half
    std::vector<int> right(mid, v.end());            // second half
    mergesort(left);                                 // recurse
    mergesort(right);                                // recurse
    std::merge(left.begin(), left.end(),             // combine the sorted halves back into v
               right.begin(), right.end(), v.begin());
}
Mergesort, quicksort, the fast Fourier transform, binary search — the entire family of divide-and-conquer algorithms is naturally recursive. Mergesort and the FFT run in O(n log n); quicksort matches that on average; binary search needs only O(log n), because it keeps just one half at each step. In every case the log n factor comes from the depth of the recursion tree.
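Binary search shows the shape at its smallest: halve the range and keep only the half that could contain the target. A minimal recursive sketch (the name bsearch_rec and the lo/hi interface are mine, not the standard library's):
// binary search on a sorted vector: returns the index of x, or -1 if absent
#include <vector>

int bsearch_rec(const std::vector<int>& v, int x, int lo, int hi) {
    if (lo > hi) return -1;                     // base: empty range, x isn't here
    int mid = lo + (hi - lo) / 2;
    if (v[mid] == x) return mid;                // found it
    if (v[mid] < x)
        return bsearch_rec(v, x, mid + 1, hi);  // recurse into the right half
    return bsearch_rec(v, x, lo, mid - 1);      // recurse into the left half
}
// bsearch_rec(v, 42, 0, (int)v.size() - 1);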
Three things that bite:
A missing or wrong base case. The recursion never bottoms out, the calls never stop, and the program dies when the stack runs out.
Depth. Even with a correct base case, every call costs a stack frame, so recursing a million levels deep overflows the stack where a loop would have been fine.
Repeated work. Naive Fibonacci computes fib(n-2) over and over — exponentially many times. The fix is memoisation (cache results) or a rewrite as iteration; there's a sketch below.
Anything you can write recursively, you can write iteratively. Anything you can write iteratively, you can write recursively. They're equivalent in power. The difference is fit: recursion reads best when the problem is self-similar, and a plain loop reads best (and costs less) when it isn't.
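To make the repeated-work problem concrete, here is naive Fibonacci next to a memoised version (a sketch; the names fib and fib_memo are mine). The naive one makes an exponential number of calls; the cached one makes a linear number.
#include <unordered_map>

// naive Fibonacci: correct, but fib(n-2) is recomputed again and again
long long fib(int n) {
    if (n < 2) return n;                 // base cases: fib(0) = 0, fib(1) = 1
    return fib(n - 1) + fib(n - 2);      // exponentially many calls
}

// memoised Fibonacci: same recursion, but each result is computed once
long long fib_memo(int n, std::unordered_map<int, long long>& cache) {
    if (n < 2) return n;
    auto hit = cache.find(n);
    if (hit != cache.end()) return hit->second;     // already known: reuse it
    long long r = fib_memo(n - 1, cache) + fib_memo(n - 2, cache);
    cache[n] = r;                                   // remember it for next time
    return r;                                       // O(n) calls instead of O(2^n)
}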
Modern compilers can often turn tail-recursive functions (where the recursive call is the last thing the function does) into iterative loops automatically — eliminating the stack frame growth. C++ does this opportunistically; some functional languages guarantee it.
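For instance, the factorial from earlier can be rewritten in tail form by threading a running product through the calls. A sketch, with the caveat that C++ compilers apply this optimisation at their discretion rather than by guarantee:
// tail-recursive factorial: the recursive call is the very last thing done,
// so an optimising compiler can reuse the current stack frame instead of growing it
long long factorial_tail(int n, long long acc = 1) {
    if (n <= 1) return acc;                  // base: the running product is the answer
    return factorial_tail(n - 1, acc * n);   // tail call: nothing left to do afterwards
}
// factorial_tail(5);   // 120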
Game-tree search — Minimax (Week 38), Alpha-Beta (Week 43) — is recursion at its purest: "to evaluate this position, evaluate each move's resulting position, then combine". The natural shape of the algorithm is recursive, and so is the natural shape of the implementation.
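Stripped of everything game-specific, the skeleton looks roughly like this; Position, Move, legal_moves(), apply() and evaluate() are hypothetical stand-ins, not code from a later week.
// the recursive skeleton of minimax (a sketch over hypothetical game types)
#include <algorithm>   // std::max, std::min
#include <climits>     // INT_MIN, INT_MAX
#include <vector>

struct Position { /* game state: hypothetical */ };
struct Move     { /* one legal move: hypothetical */ };
std::vector<Move> legal_moves(const Position&);   // declared only, for the sketch
Position apply(const Position&, const Move&);
int evaluate(const Position&);                    // static score of a position

int minimax(const Position& p, int depth, bool maximising) {
    std::vector<Move> moves = legal_moves(p);
    if (depth == 0 || moves.empty())
        return evaluate(p);                       // base: leaf position
    int best = maximising ? INT_MIN : INT_MAX;
    for (const Move& m : moves) {
        int score = minimax(apply(p, m), depth - 1, !maximising);   // evaluate each reply
        best = maximising ? std::max(best, score)                   // combine: best for the mover
                          : std::min(best, score);
    }
    return best;
}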
Backpropagation through computation graphs is also recursive in spirit: walk back through the graph, each node propagating gradients to its children. Modern AI frameworks usually flatten this into iterative form for performance, but the conceptual model is recursive.
When the problem looks like itself, the solution probably should too.
Now that we have recursion, we have the toolkit for a classic problem: putting things in order. Next week we meet five sorting algorithms — three of them historical curiosities, two of them genuinely useful, and the rule for picking between them.
Week 33 is Sorting.
Photo free under the Unsplash license. Dolls · Hafsa LR.